dataset

Published By: Cognizant     Published Date: Oct 23, 2018
In the last few years, a wave of digital technologies has changed the banking landscape: social and mobile altered the way banks engage with customers, analytics enabled hyper-personalized offerings by making sense of large datasets, and cloud technologies shifted the computing paradigm from CapEx to OpEx, enabling delivery of business processes as services from third-party platforms. Now, a second wave of disruption is set to drive even more profound changes, including robotic process automation (RPA), AI, IoT instrumentation, blockchain distributed ledgers and shared infrastructure, and open banking platforms controlled by application programming interfaces (APIs). As these technologies become commercialized, and as demand increases for digitally enabled services, we will see unprecedented disruption as non-traditional banks and fintechs rush into all segments of the banking space. This whitepaper examines key considerations for banks as they explore value in the emerging Digital 2.0 world.
Tags : 
cognizant, banking, digital
    
Cognizant
Published By: Carbonite     Published Date: Oct 10, 2018
IT admins tasked with restoring servers or lost data during a disruption are consumed with a single-minded purpose: successful recovery. But it shouldn’t take an adverse event to underscore the importance of recovery as part of an overall backup strategy. This is especially true with large datasets. Before you consider how you’re going to back up large datasets, first consider how you may need to recover the data.
Tags : 
    
Carbonite
Published By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
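As a rough illustration of the in-memory approach described above, the following is a generic Python/pandas sketch, not SAP's technology: it contrasts re-reading a dataset from disk for every query with loading it into memory once. The synthetic file and column names are assumptions made purely for the example.

```python
# A minimal sketch of the in-memory idea: answer queries from a dataset already
# held in RAM instead of re-reading it from disk each time. Illustrative only;
# real in-memory databases are far more sophisticated.
import time
import numpy as np
import pandas as pd

# Build a synthetic dataset and persist it to disk (a stand-in for a warehouse table).
df = pd.DataFrame({
    "region": np.random.choice(["NA", "EMEA", "APAC"], size=1_000_000),
    "revenue": np.random.rand(1_000_000) * 1000,
})
df.to_csv("sales.csv", index=False)

# Disk-bound approach: reload the file every time a query is issued.
start = time.perf_counter()
for _ in range(3):
    pd.read_csv("sales.csv").groupby("region")["revenue"].sum()
print(f"re-read from disk each query: {time.perf_counter() - start:.2f}s")

# In-memory approach: load once, then answer every query from RAM.
in_memory = pd.read_csv("sales.csv")
start = time.perf_counter()
for _ in range(3):
    in_memory.groupby("region")["revenue"].sum()
print(f"pre-loaded in memory:         {time.perf_counter() - start:.2f}s")
```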
Tags : 
sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools, analytical applications
    
SAP
Published By: Amazon Web Services     Published Date: Sep 05, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources.
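The Spectrum pattern described above can be sketched in Python. This is a hedged example, not AWS's reference implementation: the cluster endpoint, credentials, Glue database, IAM role ARN, and the page_views table are placeholders, and the external-schema SQL should be checked against the current Amazon Redshift documentation.

```python
# A hedged sketch of the Redshift Spectrum pattern: define an external schema
# over data that stays on S3, then query it with ordinary SQL. All identifiers
# below are placeholders -- adapt them to your own environment.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder endpoint
    port=5439,
    dbname="analytics",
    user="awsuser",
    password="***",
)
conn.autocommit = True

with conn.cursor() as cur:
    # External schema backed by the AWS Glue Data Catalog; the data itself
    # remains on S3 and is never loaded into the cluster.
    cur.execute("""
        CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum_demo
        FROM DATA CATALOG
        DATABASE 'clickstream_db'
        IAM_ROLE 'arn:aws:iam::123456789012:role/MySpectrumRole'
        CREATE EXTERNAL DATABASE IF NOT EXISTS;
    """)

    # Query the S3-resident table as if it were a local Redshift table.
    cur.execute("""
        SELECT event_date, COUNT(*) AS events
        FROM spectrum_demo.page_views
        GROUP BY event_date
        ORDER BY event_date;
    """)
    for event_date, events in cur.fetchall():
        print(event_date, events)

conn.close()
```

The design point, per the description above, is that storage keeps growing on S3 while Spectrum supplies the compute for the scan, so the cluster is not sized around the raw data volume.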
Tags : 
    
Amazon Web Services
Published By: Carbonite     Published Date: Apr 09, 2018
IT admins tasked with restoring servers or lost data during a disruption are consumed with a single-minded purpose: successful recovery. But it shouldn’t take an adverse event to underscore the importance of recovery as part of an overall backup strategy. This is especially true with large datasets. Before you consider how you’re going to back up large datasets, first consider how you may need to recover the data. Variables abound. Is it critical or non-critical data? A simple file deletion or a system-wide outage? A physical server running onsite or a virtual one hosted offsite? These and a handful of other criteria will determine your backup and disaster recovery (BDR) deployment. What do we mean by large? A simple question with a not-so-simple answer. If your total data footprint is 5 TB or more, that’s considered large. But what kind of data is it? How many actual files are there? How frequently do they change? How much can they be compressed? It’s likely that two different 5 TB environments will have very different backup and recovery requirements.
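The recovery-first questions above can be made concrete with a small sketch. The tiering logic, thresholds (other than the 5 TB rule of thumb mentioned in the paper), and field names below are illustrative assumptions, not Carbonite's methodology.

```python
# A hedged sketch of recovery-first planning: answer the recovery questions
# before choosing a backup approach. Tier names and targets are assumptions.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    size_tb: float         # total footprint in terabytes
    critical: bool         # does the business stop without it?
    virtual_offsite: bool  # hosted on a virtual server offsite?

def recovery_tier(ds: Dataset) -> dict:
    """Map a dataset to rough recovery objectives before picking a backup tool."""
    if ds.critical:
        rto_hours, rpo_hours = 1, 0.25   # near-continuous protection
    else:
        rto_hours, rpo_hours = 24, 24    # nightly backup is acceptable
    large = ds.size_tb >= 5              # "large" per the 5 TB rule of thumb
    return {
        "dataset": ds.name,
        "large_dataset": large,
        "rto_hours": rto_hours,
        "rpo_hours": rpo_hours,
        # Assumption: very large onsite datasets may need a seeded initial backup.
        "seed_initial_backup": large and not ds.virtual_offsite,
    }

for ds in [Dataset("erp_db", 8.0, True, False),
           Dataset("archive_share", 3.5, False, True)]:
    print(recovery_tier(ds))
```

Output like this would then drive which backup schedule and BDR deployment fits each dataset, rather than the other way around.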
Tags : 
    
Carbonite
Published By: Wasabi     Published Date: Oct 23, 2017
An explosion of data storage needs, in terms of both volume and accessibility, is going unmet by first-generation storage solutions. The massive datasets now being generated are too costly to store and, because of speed limitations, cannot be fully leveraged. The needs of individual businesses, and of our greater economy, demand the commoditization of cloud storage. Cloud Storage 2.0 represents a new generation of solutions that promise to turn cloud storage into a utility along the lines of bandwidth and electricity. Leading this evolution with high-speed, low-cost, reliable cloud storage is Wasabi. In this white paper we look at the genesis and possibilities of Cloud Storage 2.0, and Wasabi’s place at its forefront. A free trial, with no credit card required, is also available.
Tags : 
wasabi, cloud storage, data storage, storage solutions
    
Wasabi
Published By: Fiserv     Published Date: Nov 07, 2017
"In today’s ever-evolving lending landscape where loan quality and risk management challenge profitability and the customer experience, technology may be the key to thriving – both now and in the future. Winning financial services institutions will be the ones that transform their business models to place loan quality and risk management at the center of their operations. To facilitate continuous life-of-loan management, inclusive of the requisite data transparency and audit trails that support loan quality and loss mitigation, these institutions will implement and automate a loan completion process. Such a process will manage data quality and access to loan data and documents throughout origination, servicing and sale on the secondary market."
Tags : 
mortgage data quality, loan quality, loan data quality, mortgage quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset
    
Fiserv
Published By: Fiserv     Published Date: Nov 07, 2017
Learn how loan onboarding can become more efficient and accurate by eliminating manual data validation with automation technology that is poised to transform mortgage servicing. From end-to-end, tools can simplify workflow processes, driving time and cost efficiencies. Trained staff can be deployed to greater effect and can be crucial to eliminating servicing errors. In the process, servicers improve data quality, save time and money, and deliver a better borrower experience.
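To illustrate the kind of automated data validation this entry refers to, here is a hedged Python sketch. The field names, rules, and thresholds are hypothetical and do not represent Fiserv's products or the actual UCD/TRID rule sets.

```python
# A hedged sketch of automated loan-data validation. Fields and rules are
# hypothetical; a real servicing platform applies far richer rule sets.
from datetime import date

REQUIRED_FIELDS = ("loan_id", "note_rate", "loan_amount", "closing_date")

def validate_loan(record: dict) -> list[str]:
    """Return a list of data-quality defects found in a single loan record."""
    defects = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            defects.append(f"missing required field: {field}")
    rate = record.get("note_rate")
    if rate is not None and not (0 < rate < 20):
        defects.append(f"note_rate out of plausible range: {rate}")
    amount = record.get("loan_amount")
    if amount is not None and amount <= 0:
        defects.append(f"loan_amount must be positive: {amount}")
    closing = record.get("closing_date")
    if isinstance(closing, date) and closing > date.today():
        defects.append("closing_date is in the future")
    return defects

loan = {"loan_id": "L-1001", "note_rate": 6.25, "loan_amount": 0,
        "closing_date": date(2017, 10, 2)}
print(validate_loan(loan))  # -> ['loan_amount must be positive: 0']
```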
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience, mortgage origination, mortgage origination automation, mortgage servicing, mortgage servicing automation
    
Fiserv
Published By: Fiserv     Published Date: Nov 07, 2017
"Recently, a number of factors have come together to decimate the profitability of the mortgage banking industry. To regain its footing, the industry must return to mortgage banking fundamentals. This paper carefully examines each function within the mortgage business to determine if there is a better approach that will save money and improve long-term profitability."
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Fiserv     Published Date: Nov 07, 2017
"Improve Loan Data Quality and Compliance from Origination to Delivery. This complimentary CEB Gartner paper helps identify process and technology issues that lead to loan defects. Learn strategies for fixing issues and recommends technologies to help lenders improve loan data quality and compliance to reduce costs and improve the borrower experience. "
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Fiserv     Published Date: Nov 09, 2017
Digital loan origination processes can still require significant manual support, which is often inaccurate and time-consuming. This National Mortgage News paper, sponsored by Fiserv, explains how you can improve your current loan production while reducing costs and risk of non-compliance.
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv
Published By: Waterline Data & Research Partners     Published Date: Nov 07, 2016
For many years, traditional businesses have had a systematic set of processes and practices for deploying, operating and disposing of tangible assets and some forms of intangible assets. Based on significant growth in inquiry discussions with clients and increased attention from industry regulators, Gartner now sees recognition of information as an asset becoming increasingly pervasive. At the same time, CDOs and other data and analytics leaders must account for both internally generated datasets and exogenous sources, such as data from partners, open data, and content from data brokers and analytics marketplaces, as they come to terms with the ever-increasing quantity and complexity of information assets. This task is clearly impossible if the organization lacks a clear view of what data is available, how to access it, its fitness for purpose in the contexts in which it is needed, and who is responsible for it.
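As a minimal sketch of the catalog questions raised above (what data is available, how to access it, its fitness for purpose, and who owns it), the structure below is illustrative only and is not Waterline's product; every name and location is invented for the example.

```python
# A minimal, hypothetical data-catalog record answering: what exists, how to
# access it, who is responsible, and whether it is fit for a given purpose.
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str
    location: str             # how to access it (path, URI, connection string)
    owner: str                # who is responsible for it
    source: str               # internal system, partner feed, data broker, etc.
    fitness: dict = field(default_factory=dict)  # fitness-for-purpose by use case

catalog = [
    CatalogEntry(
        name="customer_transactions",
        location="s3://corp-dl/finance/transactions/",
        owner="finance-data@example.com",
        source="internal",
        fitness={"regulatory_reporting": "approved", "ml_training": "needs review"},
    ),
    CatalogEntry(
        name="demographic_overlay",
        location="jdbc:postgresql://broker.example.com/demo",
        owner="third-party-data@example.com",
        source="data broker",
        fitness={"marketing_segmentation": "approved"},
    ),
]

# Answer the basic question: what do we have that is fit for this purpose?
fit_for_reporting = [e.name for e in catalog
                     if e.fitness.get("regulatory_reporting") == "approved"]
print(fit_for_reporting)
```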
Tags : 
    
Waterline Data & Research Partners
Published By: Alteryx, Inc.     Published Date: Sep 07, 2017
To learn how to get your Tableau datasets faster, download the How To Guide “6 Steps to Faster Data Blending for Tableau.”
Tags : 
    
Alteryx, Inc.
Published By: Sage     Published Date: Jul 08, 2015
This white paper describes how ERP technology can improve efficiency by:
• Standardizing and automating business processes—locally as well as across multiple locations and countries—to accelerate business operations.
• Offering a fully integrated suite of business management applications that share a common dataset, and extending these applications over the Internet, allowing visibility and collaboration across departments, as well as with customers, partners, suppliers, and remote users.
• Providing flexible and customizable reporting to improve business reporting, analysis, and insight.
Tags : 
enterprise resource planning, erp, efficiency, operating costs, standardization, automation, business management
    
Sage
Published By: Amazon Web Services     Published Date: Nov 14, 2018
Amazon Redshift Spectrum—a single service that can be used in conjunction with other Amazon services and products, as well as external tools—is revolutionizing the way data is stored and queried, allowing for more complex analyses and better decision making. Spectrum allows users to query very large datasets on S3 without having to load them into Amazon Redshift. This helps address the Scalability Dilemma—with Spectrum, data storage can keep growing on S3 and still be processed. By utilizing its own compute power and memory, Spectrum handles the hard work that would normally be done by Amazon Redshift. With this service, users can now scale to accommodate larger amounts of data than the cluster would have been capable of processing with its own resources. This e-book aims to provide you with expert tips on how to use Amazon Redshift Spectrum to increase performance and potentially reduce the cost of your queries.
Tags : 
    
Amazon Web Services
Published By: Vertica     Published Date: Feb 23, 2010
Ovum conducts a deep-dive technology audit of Vertica's Analytic Database, which is designed specifically for storing and querying large datasets.
Tags : 
ovum, vertica, analytical databases, dbms, technology audit, mpp, rdbms, analytical applications, business intelligence, data mining, information management, data warehousing
    
Vertica
Published By: HP     Published Date: Jan 16, 2015
Register below to gain exclusive access to the HP-NVIDIA® Autodesk Building Design Suite 2015 Graphics Optimization guide to help you get the most out of your workstation. Upon submission of your personal information, an HP representative will contact you regarding your interests and needs.
Tags : 
visualization, bim, business information, workstations, building design, viewport, autodesk, cloud datasets, optimization, best practice, business application, workflow, autodesk showcase, productivity
    
HP
Published By: HP     Published Date: Feb 11, 2015
Register below to gain exclusive access to the BIM tutorial video, in which Autodesk's Lynn Allen gives pointers on maximizing your performance and productivity with key features in BIM applications – ultimately, helping you get the most out of your workstation. Upon submission of your personal information, an HP representative will contact you regarding your interests and needs.
Tags : 
visualization, bim, business information, workstations, building design, viewport, autodesk, cloud datasets, optimization, best practice, business application, workflow, autodesk showcase, productivity
    
HP
Published By: Cornerstone OnDemand     Published Date: May 15, 2015
Leveraging econometric analysis of a dataset of approximately 63,000 hired employees spanning approximately 250,000 observations, this report looks not only at the measurable costs of toxic behavior such as sexual harassment, theft and fraud, but also at other, equally damaging and harder-to-measure costs. The report examines these indirect costs closely, looking particularly at the toll toxic employees take on co-workers, and concludes that these costs create an even larger financial burden on businesses than the direct impact of an employee's misbehavior.
Tags : 
hidden costs, sexual harassment, toxic employees, employee behavior
    
Cornerstone OnDemand
Published By: SAS     Published Date: Mar 14, 2014
Stop to think about how - and how often - your business interacts with customers. Most organizations believe that only a small fraction of the data those interactions generate is effectively put to use. Why is that? Check out this whitepaper to find out.
Tags : 
sas, voc, voice of customer, visual text analytics, best practices, customer voice, sound of sentiment, text data, customer data, analytical processing, structured data, enriched dataset, reporting, automatic generation, text analytics, text mining, data exploration
    
SAS
Published By: IBM     Published Date: Feb 26, 2016
With Watson Explorer, you can keep enterprise search as the foundation and transform search into Cognitive Exploration. Leveraging technological advances such as deep search and exploration, advanced content analytics, and cognitive capabilities, IBM Watson Explorer provides a unified view of the information you need, combining data from multiple internal silos and a variety of outside datasets including social media. Stop limiting your search to traditional data sources in the new, non-traditional data world.
Tags : 
watson explorer, ibm, deep search, content analytics, enterprise software
    
IBM
Published By: MedAssurant, Inc.     Published Date: Aug 31, 2010
Case Study: Beyond "Clinically Enriched Administrative Data": Using Patient-Specific Datasets to Drive Rapid Clinical & Financial Improvements
Tags : 
medassurant, medical informatics, health care, healthcare, dataset, medical claims data, patient management, administrative datasets
    
MedAssurant, Inc.
Published By: Discovery Corps, Inc     Published Date: Jan 11, 2011
An article on how to handle missing values in a dataset during a data mining project.
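For readers who want a concrete starting point, here is a generic pandas sketch of common missing-value strategies (dropping, constant fill, statistical imputation, and flagging); these are standard techniques and not necessarily the ones the article recommends.

```python
# Common ways to handle missing values in a dataset, shown on toy data.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "age":    [34, np.nan, 29, 41, np.nan],
    "income": [52_000, 61_000, np.nan, 75_000, 48_000],
    "region": ["NA", "EMEA", None, "APAC", "NA"],
})

# 1. Drop rows where a key column is missing.
complete_age = df.dropna(subset=["age"])
print(f"{len(complete_age)} of {len(df)} rows have a known age")

# 2. Fill categorical gaps with an explicit "unknown" marker.
df["region"] = df["region"].fillna("unknown")

# 3. Impute numeric gaps with a summary statistic (here, the column mean).
df["income"] = df["income"].fillna(df["income"].mean())

# 4. Keep a flag so downstream models know a value was imputed.
df["age_missing"] = df["age"].isna()
df["age"] = df["age"].fillna(df["age"].median())

print(df)
```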
Tags : 
data mining, missing values, techniques, diagnosis, tips, missing data
    
Discovery Corps, Inc
Published By: Fiserv     Published Date: Nov 07, 2017
Digital loan origination processes can still require significant manual support, which is often inaccurate and time-consuming. This National Mortgage News paper, sponsored by Fiserv, explains how you can improve your current loan production while reducing costs and risk of non-compliance.
Tags : 
loan quality, loan data quality, mortgage quality, mortgage data quality, loan compliance, lending compliance, mortgage compliance, trid, tila respa integrated disclosure, lending efficiency, loan automation, lending automation, mortgage automation, ucd, uniform closing dataset, borrower satisfaction, borrower experience
    
Fiserv