
Results 1 - 25 of 906. Sort Results By: Published Date | Title | Company Name
Published By: Couchbase     Published Date: Dec 04, 2014
Interactive applications have changed dramatically over the last 15 years. In the late ‘90s, large web companies emerged with dramatic increases in scale on many dimensions:
· The number of concurrent users skyrocketed as applications increasingly became accessible via the web (and later on mobile devices).
· The amount of data collected and processed soared as it became easier and increasingly valuable to capture all kinds of data.
· The amount of unstructured or semi-structured data exploded and its use became integral to the value and richness of applications.
Dealing with these issues was more and more difficult using relational database technology. The key reason is that relational databases are essentially architected to run on a single machine and use a rigid, schema-based approach to modeling data. Google, Amazon, Facebook, and LinkedIn were among the first companies to discover the serious limitations of relational database technology for supporting these new application requirements. Commercial alternatives didn’t exist, so they invented new data management approaches themselves. Their pioneering work generated tremendous interest because a growing number of companies faced similar problems. Open source NoSQL database projects formed to leverage the work of the pioneers, and commercial companies associated with these projects soon followed. Today, the use of NoSQL technology is rising rapidly among Internet companies and the enterprise. It’s increasingly considered a viable alternative to relational databases, especially as more organizations recognize that operating at scale is more effectively achieved running on clusters of standard, commodity servers, and that a schema-less data model is often a better approach for handling the variety and type of data most often captured and processed today.
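The schema-less data model the abstract contrasts with rigid relational schemas can be illustrated with a minimal sketch. The collection, field names, and helper below are illustrative assumptions, not Couchbase's actual API:

```python
# Hypothetical sketch of a schema-less document model: each record is a
# self-describing dictionary, so new fields can appear without a schema change.
users = [
    {"id": 1, "name": "Ada", "email": "ada@example.com"},
    # A later record adds fields a rigid relational schema would have rejected
    # without an ALTER TABLE migration.
    {"id": 2, "name": "Lin", "devices": ["phone", "tablet"], "prefs": {"theme": "dark"}},
]

def field_values(docs, field):
    """Collect a field's value from every document that happens to carry it."""
    return [doc[field] for doc in docs if field in doc]

print(field_values(users, "name"))     # every document carries a name
print(field_values(users, "devices"))  # only documents that define the field
```

The point of the sketch is that heterogeneous records coexist in one collection, which is the flexibility the abstract attributes to NoSQL stores.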
Tags : 
database, nosql, data, data management, white paper, why nosql, couchbase
    
Couchbase
Published By: Embarcadero     Published Date: Apr 29, 2015
Everything about data has changed, but that only means that data models are even more essential to understanding that data so that businesses can know what it means. As development methodologies change to incorporate Agile workflows, data architects must adapt to ensure models stay relevant and accurate. This whitepaper describes key requirements for Agile data modeling and shows how ER/Studio supports this methodology.
Tags : 
data, data management, data modeling, agile, agile data modeling, it management
    
Embarcadero
Published By: Embarcadero     Published Date: Jul 23, 2015
Whether you’re working with relational data, schema-less (NoSQL) data, or model metadata, you need a data architecture that can actively leverage information assets for business value. The most valuable data has high quality, business context, and visibility across the organization. Check out this must-read eBook for essential insights on important data architecture topics.
Tags : 
    
Embarcadero
Published By: TopQuadrant     Published Date: Mar 21, 2015
Data management is becoming more and more central to the business model of enterprises. The time when data was looked at as little more than the byproduct of automation is long gone, and today we see enterprises vigorously engaged in trying to unlock maximum value from their data, even to the extent of directly monetizing it. Yet, many of these efforts are hampered by immature data governance and management practices stemming from a legacy that did not pay much attention to data. Part of this problem is a failure to understand that there are different types of data, and each type of data has its own special characteristics, challenges and concerns. Reference data is a special type of data. It is essentially codes whose basic job is to turn other data into meaningful business information and to provide an informational context for the wider world in which the enterprise functions. This paper discusses the challenges associated with implementing a reference data management solution and the essential components of any vision for the governance and management of reference data. It covers the following topics in some detail:
· What is reference data?
· Why is reference data management important?
· What are the challenges of reference data management?
· What are some best practices for the governance and management of reference data?
· What capabilities should you look for in a reference data solution?
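The paper's definition of reference data as "codes whose basic job is to turn other data into meaningful business information" can be sketched as a managed code table. The table contents and the unknown-code handling below are illustrative assumptions, not part of the paper:

```python
# Hypothetical sketch: reference data as a governed code table that resolves
# raw codes into meaningful business information.
COUNTRY_CODES = {
    "US": "United States",
    "DE": "Germany",
    "JP": "Japan",
}

def decode(code, table=COUNTRY_CODES):
    """Resolve a code against the reference table; flag unknown codes so a
    governance process can catch them instead of silently passing bad data."""
    if code not in table:
        return f"UNKNOWN({code})"
    return table[code]

print(decode("DE"))   # a known code resolves to its business meaning
print(decode("XX"))   # an unmanaged code surfaces as a data quality issue
```

Centralizing and governing such tables, rather than scattering copies across systems, is the kind of practice the paper's management challenges revolve around.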
Tags : 
data management, data, reference data, reference data management, top quadrant, malcolm chisholm
    
TopQuadrant
Published By: TopQuadrant     Published Date: Jun 11, 2018
Data governance is a lifecycle-centric asset management activity. To understand and realize the value of data assets, it is necessary to capture information about them (their metadata) in a connected way. Capturing the meaning and context of diverse enterprise data in connection to all assets in the enterprise ecosystem is foundational to effective data governance. Therefore, a data governance environment must represent assets and their role in the enterprise using an open, extensible and “smart” approach. Knowledge graphs are the most viable and powerful way to do this. This short paper outlines how knowledge graphs are flexible, evolvable, semantic and intelligent. It is these characteristics that enable them to:
• capture the description of data as an interconnected set of information that meaningfully bridges enterprise metadata silos.
• deliver integrated data governance by addressing all three aspects of data governance — Executive Governance, Representative Governance, and App
Tags : 
    
TopQuadrant
Published By: MapR Technologies     Published Date: Aug 04, 2018
Legacy infrastructures simply cannot handle the workloads or power the applications that will drive business decisively forward in the years ahead. New infrastructure, new thinking and new approaches are in the offing, all driven by the mantra 'transform or die.' This book is meant for IT architects; developers and development managers; platform architects; cloud specialists; and big data specialists. For you, the goal is to help create a sense of urgency you can present to your CXOs and others whose buy-in is needed to make essential infrastructure investments along the journey to digital transformation.
Tags : 
    
MapR Technologies
Published By: Melissa Data     Published Date: Jan 18, 2018
Maintaining high quality data is essential for operational efficiency, meaningful analytics and good long-term customer relationships. But when dealing with multiple sources of data, data quality becomes complex, so you need to know when you should build custom data quality tools rather than rely on canned solutions. To answer this question, it is important to understand the difference between rules-based data quality, where internal subject matter expertise is necessary, and active data quality, where different domain expertise and resources are required.
Tags : 
    
Melissa Data
Published By: Experian     Published Date: Mar 30, 2017
Businesses today recognize the importance of the data they hold, but a general lack of trust in the quality of their data prevents them from achieving strategic business objectives. Nearly half of organizations globally say that a lack of trust in their data contributes to increased risk of non-compliance and regulatory penalties (52%) and a downturn in customer loyalty (51%). To be of value to organizations, data needs to be trustworthy. In this report, you will read about the findings from this unique study, including:
· How data powers business opportunities
· Why trusted data is essential for performance
· Challenges that affect data quality
· The current state of data management practices
· Upcoming data-related projects in 2017
Tags : 
    
Experian
Published By: Syncsort     Published Date: Jul 17, 2018
In most applications we use today, data is retrieved by the source code of the application and is then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work and how the data is used. Today, in a world of AI and machine learning, data has a new role – becoming essentially the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good quality data related to the problem, it’s impossible to create a useful model. Download this whitepaper to learn why the process of identifying biases present in the data is an essential step towards debugging the data that underlies machine learning predictions and improving data quality.
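One elementary example of the data-debugging step the abstract describes is checking how training labels are distributed before fitting a model. The labels and threshold below are illustrative assumptions, not from the whitepaper:

```python
from collections import Counter

# Hypothetical sketch of a basic bias check: inspect the label distribution
# of a training set before any model is trained on it.
labels = ["approved"] * 90 + ["denied"] * 10  # illustrative, skewed data

def class_balance(labels):
    """Return each class's share of the dataset."""
    counts = Counter(labels)
    total = len(labels)
    return {cls: n / total for cls, n in counts.items()}

balance = class_balance(labels)
print(balance)
# A 90/10 skew like this can make a model look accurate while it simply
# predicts the majority class - one reason to debug the data first.
```

Real bias auditing goes far beyond class balance (e.g., checking outcomes across demographic slices), but the principle is the same: inspect the data before trusting the model.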
Tags : 
    
Syncsort
Published By: Syncsort     Published Date: Oct 25, 2018
In most applications we use today, data is retrieved by the source code of the application and is then used to make decisions. The application is ultimately affected by the data, but source code determines how the application performs, how it does its work and how the data is used. Today, in a world of AI and machine learning, data has a new role – becoming essentially the source code for machine-driven insight. With AI and machine learning, the data is the core of what fuels the algorithm and drives results. Without a significant quantity of good quality data related to the problem, it’s impossible to create a useful model. Download this whitepaper to learn why the process of identifying biases present in the data is an essential step towards debugging the data that underlies machine learning predictions and improving data quality.
Tags : 
    
Syncsort
Published By: DATAVERSITY     Published Date: Dec 23, 2013
Everyone in an organization relies on Metadata to do their jobs. Whenever an email is sent, a report is run, inventory is ordered, compliance procedures are verified, a new IT system is integrated, applications are executed, or essentially any other business function, process, or decision is undertaken, Metadata is facilitating in the background. If that Metadata is corrupt, missing, redundant, or unpredictable then they cannot do their jobs well, they cannot trust the data they are using, and the organization ultimately suffers at all levels. Data Stewards are the people who use, define, cleanse, archive, analyze, and share the data that is mapped directly to the Metadata of their myriad database and application systems. If your organization does not have Data Stewards (or has an inefficient Stewardship Program), you need them. This paper is sponsored by: ASG.
Tags : 
    
DATAVERSITY
Published By: ASG     Published Date: Jun 10, 2013
Everyone in an organization relies on Metadata to do their jobs. Whenever an email is sent, a report is run, inventory is ordered, compliance procedures are verified, a new IT system is integrated, applications are executed, or essentially any other business function, process, or decision is undertaken, Metadata is facilitating in the background. If that Metadata is corrupt, missing, redundant, or unpredictable then they cannot do their jobs well, they cannot trust the data they are using, and the organization ultimately suffers at all levels. Data Stewards are the people who use, define, cleanse, archive, analyze, and share the data that is mapped directly to the Metadata of their myriad database and application systems. If your organization does not have Data Stewards (or has an inefficient Stewardship Program), you need them.
Tags : 
asg, data, data management, data governance, white paper, dataversity, data steward, metadata
    
ASG
Published By: MarkLogic     Published Date: Jun 16, 2013
The primary issue discussed within this paper boils down to two disparate database reliability models: ACID vs BASE. The first (ACID) has been around for some 30+ years, is a proven industry standard for SQL-centric and other relational databases, and works remarkably well in the older, yet still extant, world of vertical scaling. The second (BASE) has only recently gained popularity over the past 10 years or so, especially with the rise of social networking, Big Data, NoSQL, and other leviathans in the new world of Data Management. BASE requirements rose out of a need for ever-expanding horizontally scaled distributed networks, with non-relational data stores, and the real-time availability constraints of web-based transaction processing. While there are now more crossovers and negotiations between the two models, they essentially represent two competing groups, with Brewer’s CAP Theorem acting as the referee in the middle forcing tough decisions on each team.
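The BASE side of the trade-off the paper describes can be illustrated with a minimal eventual-consistency sketch: a write lands on one replica, a read from another replica is briefly stale, and asynchronous replication eventually converges. The two-dict "replicas" and function names are illustrative assumptions, not any real database's API:

```python
# Hypothetical sketch of BASE-style eventual consistency across two replicas.
replica_a = {}
replica_b = {}

def write(key, value):
    """A write is accepted by one node first; the system stays available."""
    replica_a[key] = value

def replicate():
    """Asynchronous catch-up: the second replica converges to the first."""
    replica_b.update(replica_a)

write("cart", ["book"])
print(replica_b.get("cart"))   # stale read: replication hasn't happened yet
replicate()
print(replica_b.get("cart"))   # replicas have converged on the written value
```

An ACID system would instead block or fail the read until the transaction was durably committed everywhere it is visible, which is exactly the availability-versus-consistency tension that Brewer's CAP Theorem formalizes for partitioned systems.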
Tags : 
data, data management, unstructured data, nosql, database, acid, base, database transactioning
    
MarkLogic
Published By: Embarcadero     Published Date: Apr 23, 2015
Everything about data has changed, but that only means that data models are even more essential to understanding that data so that businesses can know what it means. As development methodologies change to incorporate Agile workflows, data architects must adapt to ensure models stay relevant and accurate. This whitepaper describes key requirements for Agile data modeling and shows how ER/Studio supports this methodology.
Tags : 
data, data management, data modeling, agile, agile data modeling, it management
    
Embarcadero
Published By: Cisco EMEA     Published Date: Mar 08, 2019
Learn what the Small Business cyber-threat landscape looks like today so your business can survive; reduce operational costs and grow securely; make security a priority for everyone; and protect your business with Cisco. As your business grows, it gets noticed, and not all of the attention is welcome. More and more sophisticated criminal gangs are going after small businesses.
Tags : 
    
Cisco EMEA
Published By: Cisco EMEA     Published Date: Mar 08, 2019
The Cisco Meeting and Team Collaboration Experience. The workplace has changed and today the most agile workforces outperform traditional setups. But the reality of modern business life is, if you want a culture of innovation, you have to make it yourself. Teams need a workplace that’s focussed on innovation and speed. Introducing Cisco Webex Teams, a platform that helps teams to do it all. Essential kit for forward-thinking small businesses.
Tags : 
    
Cisco EMEA
Published By: Cisco EMEA     Published Date: Mar 08, 2019
The Cisco Meeting and Team Collaboration Experience. The workplace has changed and today the most agile workforces outperform traditional setups. But the reality of modern business life is, if you want a culture of innovation, you have to make it yourself. Teams need a workplace that’s focussed on innovation and speed. Introducing Cisco Webex Teams, a platform that helps teams to do it all. Essential kit for forward-thinking small businesses.
Tags : 
    
Cisco EMEA
Published By: Sage EMEA     Published Date: Jan 29, 2019
Enterprises must continuously change to keep ahead of the competition, reduce silos, improve connectivity and respond rapidly to a changing world. Organisations also need to drive continuous innovation with technology that helps them adapt faster. So if you’re thinking of replacing your legacy ERP system, start by asking yourself these three essential questions:
Tags : 
    
Sage EMEA
Published By: Tenable     Published Date: Feb 27, 2019
Overwhelmed by the number of vulnerabilities your team faces? Uncertain which cyber threats pose the greatest risk to your business? You’re not alone. Cybersecurity leaders have been grappling with these challenges for years – and the problem keeps getting worse. On average, enterprises find 870 vulnerabilities per day across 960 IT assets. There just isn’t enough time or resources to fix them all. More than ever, it’s essential to know where to prioritize based on risk. Download the new whitepaper “Predictive Prioritization: How to Focus on the Vulnerabilities That Matter Most” to:
- Learn how to focus on the 3% of vulnerabilities that have been – or will likely be – exploited
- Uncover why CVSS is an insufficient metric for prioritization – and the key criteria you need to consider
- Understand the Predictive Prioritization process, which uses machine learning to help you differentiate between real and theoretical risks
Ensure you’re prioritizing the right vulnerabilities for your t
Tags : 
    
Tenable
Published By: Tenable     Published Date: Feb 27, 2019
Overwhelmed by the number of vulnerabilities your team faces? Uncertain which cyber threats pose the greatest risk to your business? You’re not alone. Cybersecurity leaders have been grappling with these challenges for years – and the problem keeps getting worse. On average, enterprises find 870 vulnerabilities per day across 960 IT assets. There just isn’t enough time or resources to fix them all. More than ever, it’s essential to know where to prioritize based on risk. Download the new whitepaper “Predictive Prioritization: How to Focus on the Vulnerabilities That Matter Most” to:
- Learn how to focus on the 3% of vulnerabilities that have been – or will likely be – exploited
- Uncover why CVSS is an insufficient metric for prioritization – and the key criteria you need to consider
- Understand the Predictive Prioritization process, which uses machine learning to help you differentiate between real and theoretical risks
Ensure you’re prioritizing the right vulnerabilities for your t
Tags : 
    
Tenable
Published By: Epicor Software Corporation     Published Date: Jan 09, 2019
To Keep Pace With Your Customers and Competitors, It’s Time to Leave Your Legacy Software Behind. Legacy enterprise resource planning (ERP) systems struggle to keep up with the modern pace of business, and they fail to meet the changing needs of your workforce. Can your leaders quickly find and analyze vital data? Are your employees’ tasks and processes straightforward and efficient? Does your ERP software make it easier to support growth? See how manufacturers upgrading to the latest ERP software from Epicor transform their business by delivering the visibility, efficiency, and productivity essential for sustained growth.
Tags : 
    
Epicor Software Corporation
Published By: Stratasys EMEA     Published Date: Feb 21, 2019
Factory production lines know the right jig or fixture speeds production, which increases productivity. But that’s just the beginning. Well-designed tools are more ergonomic, offering increased worker safety and productivity as well as cost savings. Traditional machining produces heavy, costly, multi-piece tools that become an even greater liability as repetitive motion injuries erode line productivity through worker disability. Redesign means even more protracted timelines for machined parts. While essential to efficiency, accuracy and safety, jigs and fixtures are often considered a necessary evil in the overall production process. Costly, protracted timelines for machined jigs and fixtures are the culprit here, especially for the often complex designs necessary to meet unique part needs. This, along with certain complex designs that simply cannot be manufactured using traditional methods, is a reality on the production floor. But there is a better way. 3D printed jigs and fixture
Tags : 
    
Stratasys EMEA
Published By: MindTouch     Published Date: Mar 08, 2019
Better with knowledge management. To operate effectively, chatbots need to be able to quickly match user intent to relevant information. This means querying the available knowledge sources for answers. As it turns out, the better organizations structure, organize, and publish this information, the better their chatbots will be. This puts next-gen knowledge management solutions square in the middle of the chatbot conversation.
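The matching step described above, querying available knowledge sources for the answer closest to a user's intent, can be sketched with a simple word-overlap score. The articles and the scoring rule are illustrative assumptions; production chatbots use far richer natural language understanding:

```python
import re

# Hypothetical sketch: score each knowledge-base article by how many words
# it shares with the user's utterance, then return the best match.
knowledge_base = {
    "reset-password": "how to reset your account password",
    "billing": "update billing and payment details",
}

def best_article(utterance):
    """Pick the article whose text overlaps most with the utterance."""
    words = set(re.findall(r"\w+", utterance.lower()))
    scores = {
        key: len(words & set(text.split()))
        for key, text in knowledge_base.items()
    }
    return max(scores, key=scores.get)

print(best_article("I forgot my password, how do I reset it"))
```

The sketch also shows why well-structured knowledge matters: the cleaner and more consistently the articles are written and organized, the more reliably even a simple matcher (or a real NLU model) lands on the right one.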
Tags : 
    
MindTouch
Published By: Google Apigee     Published Date: Feb 01, 2019
We all use APIs every day. The demands of digital transformation, and the related need for platforms and ecosystems, make it essential to manage APIs throughout their life cycle. We identify the pros and cons of a wide range of API management vendors and offerings, to help you make the right choice.
Tags : 
    
Google Apigee
Published By: Docker     Published Date: Feb 21, 2019
Docker containers exploded onto the scene in 2013 as a better way to develop software and have quickly become part of the enterprise infrastructure. Organizations often start by containerizing applications – either components of a monolithic application or new distributed applications. But containerization itself isn’t enough to become more innovative. It requires changing processes, culture and the overall organizational mindset. That makes a container platform essential to success. In this paper, we will discuss what a container platform is and why it’s a critical part of any effort to drive change and innovation in the digital economy.
Tags : 
migrate applications, containerization, application development, digital transformation, microservices, modernization, cloud computing, hybrid cloud, application security
    
Docker