Topic: Data Warehousing
In this white paper, best-selling author Dr. Ralph Kimball provides detailed guidance for designing, administering, and deploying an Enterprise Data Warehouse (EDW) to support business analytics in the era of big data.
Dr. Kimball examines the similarities and differences of 20 use cases for big data analytics – summarising the salient structure and processing characteristics for each. This is an important exercise because any given enterprise is increasingly likely to encounter more than one of these use cases. It is also important to draw on the lessons learned from the early data warehousing era – while also gaining a thorough understanding of the new culture, technical skills, techniques, systems and organisational changes that will be required for big data analytics over the coming decade.
The key topics covered within this Kimball Group white paper include:
- Why data should be considered an asset on the balance sheet.
- The differences between conventional Relational Database Management Systems (RDBMS) and MapReduce/Hadoop systems – and why the two need to coexist.
- The importance of sandboxes and why they should form part of a cross-divisional analytics community.
- Why it will become increasingly important for Complex Event Processing and EDW to share data and work more closely together in the coming decade.
- How the use of light touch data will expand over the next decade.
Data integration has traditionally meant compromise: speed came at the expense of quality, and ensuring both quality and rapid delivery sent costs spiralling out of control. This is highlighted by Gartner’s estimate that, for some organisations, data integration costs will double without a focus on data quality. It doesn’t have to be this way.
Next generation data integration expands the scope of data integration to meet emerging IT and business demands – eliminating traditional trade-offs and enabling data management teams to execute projects better, cheaper, and faster. By focusing on people, processes, and technologies, next generation data integration takes advantage of best practices and tools that have steadily matured in capability and reach, driving an evolution toward the ideal of an innovative, agile, data-driven enterprise.
During this Informatica forum we will examine the IT and business pressures that are making next generation data integration a strategic imperative, outline the key characteristics of a next generation data integration environment, and provide practical guidance for making the transition.
What you will learn during this half day event:
- Why dramatic changes in data volume, variety, and velocity make the traditional approach to data integration inadequate.
- How a next generation approach transforms data integration from a project to a key business process.
- The quantifiable business impact of adopting next generation integration.
- Why next generation data integration should be a process where IT is an enabler – not a leader.
- Why it is so important that the “data quality team” expands beyond IT to include more people who actually use the data day-to-day.
- The importance of automation in improving the efficiency of the development life-cycle and the role of a metadata repository/business glossary.
In this white paper, best-selling author Dr. Ralph Kimball details more than twenty big data best practices spanning four categories: management, architecture, modelling, and governance. Some of the best practices are recognisable extensions from the EDW/BI world, but others are wholly new insights unique to big data.
Big data dramatically expands IT’s scope of responsibility with new data types, new methods of analysis, new storage and processing platforms, and new user expectations. Now that we have almost a decade of experience with big data, it is time to review the best practices that have emerged during this time of dynamic change.
The best practices captured within this Kimball Group white paper include:
- Why big data environments should be structured around analytics, not ad hoc querying or standard reporting.
- Why you should not attempt to build a legacy big data environment at this time.
- The importance of embracing sandbox silos and why you should build a practice of productionising sandbox results.
- How to plan a logical “data highway” with multiple caches of increasing latency.
- The advantages of performing big data prototyping on a public cloud before moving to a private cloud.
- The importance of thinking dimensionally by dividing the world into dimensions and facts – before applying governance.
Insurance Industry Webinar Featuring Forrester Research: Delivering on the Promise of Data Growth in Today’s Insurance Industry
Soft market conditions, ongoing commoditisation of traditional insurance products, changing consumer behaviours, and dramatic regulatory changes have made it more difficult for insurance providers to attract and retain customers and grow revenue. These challenges are compounded by the exploding volume, variety, and velocity of data needed to gain actionable insights and meet business demands.
Join Peter Ku, Director of Financial Services Solution Marketing, Informatica and guest speaker Ellen Carney, Senior Analyst, Forrester Research, Inc. for this webinar to learn the strategies leading insurance firms are undertaking to modernise their IT environments. They will discuss:
- Market trends and challenges insurance providers will face in 2013.
- Key technologies that improve business insight and create actionable intelligence.
- The current state of data governance in today’s insurance industry.
- How to make smart investments and avoid technology pitfalls in the era of Big Data.
Join Lean Integration author David Lyle, George Yuhasz from HealthNow NY and Rob Myers from Corporate Technologies for a webinar to hear how world-class organisations are applying lean principles and new technologies to re-engineer their processes and architecture. Discover how your organisation can achieve unparalleled agility, lower costs, greater scalability and real business impact.
Our panel of experts will discuss how you can…
- Unwind the data integration hairball that makes changing reports and warehouses so time-consuming.
- Cut the time and cost of data warehouse projects in half.
- Give end users complete real-time visibility and trust in the data.
Don’t miss this opportunity to hear proven, real-world strategies for data integration and data warehousing that meet and exceed today’s organisational goals.