- Application ILM
- Application Retirement
- B2B Data Exchange
- Big Data
- Business Intelligence
- Cloud Computing
- Complex Event Processing
- Data Archive
- Data Governance
- Data Integration
- Data Masking
- Data Migration
- Data Privacy
- Data Quality
- Data Replication
- Data Services
- Data Validation
- Data Virtualization
- Data Warehousing
- Database Archiving
- Identity Resolution
- Informatica 9.5
- Informatica Events and Webinars
- Lean Data Management
- Master Data Management (MDM)
- Metadata Management
- Oil and Gas
- Oracle Business Intelligence Applications (OBIA)
- Proactive Monitoring
- SAP BW
- Service-Oriented Architecture (SOA)
- Test Data Management
- Ultra Messaging
Topic: Data Integration
In this TDWI Checklist Report, best-selling author David Loshin provides detailed recommendations for developing a value justification for next generation data integration that engages business sponsors, along with guidance on developing a strategic IT road map that promotes the value of data architecture and data integration.
David Loshin examines the challenge of effectively translating the perception of “data as an asset” into concrete terms. This is particularly useful as IT and data management proposals draw increased scrutiny – after all, a clear value proposition is a prerequisite for getting budget allocated. But it does not stop there: it is also essential to secure the ongoing support of business users. This can only be achieved by developing a strategic IT road map that stays in lockstep with the current and future needs of the business.
The seven recommendations detailed in this TDWI Checklist Report are:
- Engage with business sponsors to clarify information value and prioritise data integration initiatives.
- Develop a strategic road map and program for data integration as a discipline.
- Evolve the data architecture to support key emerging technologies.
- Adjust the data integration and analytics architecture to scale with business productivity demands.
- Employ data integration and metadata tools to drive change.
- Develop and encourage tools for self-service and collaboration.
- Promote the proper level of data governance.
Data integration has traditionally meant compromise: speed came at the expense of quality, and if both quality and rapid delivery were ensured, costs would spiral out of control. This is highlighted by Gartner’s estimate that for some organisations, data integration costs will double without an added focus on data quality. It doesn’t have to be this way.
Next generation data integration expands the scope of data integration to meet emerging IT and business demands, eliminating the traditional trade-offs and enabling data management teams to execute projects better, cheaper, and faster. By focusing on people, processes, and technologies, next generation data integration takes advantage of best practices and tools that have steadily matured in capability and reach, driving an evolution toward the ideal of an innovative, agile, data-driven enterprise.
During this Informatica forum we will examine the IT and business pressures that are making next generation data integration a strategic imperative, outline the key characteristics of a next generation data integration environment, and provide practical guidance for making the transition.
What you will learn during this half day event:
- Why dramatic changes in data volume, variety, and velocity make the traditional approach to data integration inadequate.
- How a next generation approach transforms data integration from a project to a key business process.
- The quantifiable business impact of adopting next generation integration.
- Why next generation data integration should be a process where IT is an enabler – not a leader.
- Why it is so important that the “data quality team” expands beyond IT to include more people who actually use the data day-to-day.
- The importance of automation in improving the efficiency of the development life-cycle and the role of a metadata repository/business glossary.
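To make the last point concrete: a business glossary, at its simplest, links business terms to their agreed definitions and to the physical data elements that implement them, so developers and business users share one vocabulary. Here is a minimal, hypothetical sketch in Python — the terms, column names, and stewards are invented for illustration, not any particular product's schema:

```python
# Minimal business glossary: business terms mapped to a definition,
# the physical columns that implement them, and a steward.
# All entries are invented for illustration.
glossary = {
    "customer lifetime value": {
        "definition": "Projected net revenue from a customer relationship.",
        "implemented_by": ["warehouse.sales.clv", "crm.accounts.ltv_estimate"],
        "steward": "Finance",
    },
    "active customer": {
        "definition": "A customer with at least one order in the last 12 months.",
        "implemented_by": ["warehouse.sales.is_active"],
        "steward": "Sales Ops",
    },
}

def where_implemented(term):
    """Look up the physical columns behind a business term."""
    entry = glossary.get(term.lower())
    return entry["implemented_by"] if entry else []

print(where_implemented("Active Customer"))  # → ['warehouse.sales.is_active']
```

Even a lookup this simple shows why a shared glossary shortens the development life cycle: a developer can trace a requirement phrased in business terms straight to the columns that satisfy it.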
Today, operational intelligence and the elimination of costly decisions based on bad data are top of mind for business and IT stakeholders. To achieve this, you need a way to identify potential opportunities or threats early enough to take action, while reducing the time and cost of data validation testing.
So, if you are involved in data integration projects, it is critical to look at your software development life cycle (SDLC) for ways to reduce time, cost, and risk across development, test, and production. However, few solutions can offer significant savings through proactive monitoring and alerting on operational and development issues, or provide automated data integration testing.
In this webinar, Ted Friedman, VP Distinguished Analyst at Gartner Research, and David Lyle, VP Product Strategy at Informatica, will discuss the key considerations to keep in mind for enforcing development best practices, automating data integration testing, and delivering operational intelligence to stakeholders.
During this Informatica webinar you will learn about:
- How to reduce ETL testing cycles by 50 to 80 percent
- How to proactively identify and respond to data integration development and operations risks before they become issues
- How leveraging a single solution can significantly reduce time, cost and risk across development, test and production
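As an illustration of what automated data integration testing can look like in practice, here is a minimal, hypothetical sketch in Python. It compares row counts and content checksums between a source and a target table — the kind of reconciliation check a test harness would run automatically after each ETL load. The table names and the in-memory database are assumptions for illustration only:

```python
import sqlite3
import hashlib

def table_fingerprint(conn, table):
    """Return (row_count, checksum) for a table, ordered deterministically."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    digest = hashlib.sha256(repr(rows).encode()).hexdigest()
    return len(rows), digest

def reconcile(conn, source, target):
    """Compare a source and target table after an ETL load; return issues found."""
    src_count, src_sum = table_fingerprint(conn, source)
    tgt_count, tgt_sum = table_fingerprint(conn, target)
    issues = []
    if src_count != tgt_count:
        issues.append(f"row count mismatch: {src_count} vs {tgt_count}")
    elif src_sum != tgt_sum:
        issues.append("content checksum mismatch despite equal row counts")
    return issues

# Demo with an in-memory database standing in for real source/target systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b")])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "a"), (2, "b")])
print(reconcile(conn, "src", "tgt"))  # → [] (tables match)
```

Wiring checks like this into the SDLC is what turns data validation from a manual test cycle into a repeatable, alert-driven process.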
Daragh O Brien will deliver the keynote at this prestigious event in Dublin 2 on Wednesday 20 March 2013. Daragh is Managing Director of Castlebridge Associates, an author, and an internationally respected thought leader in Data Quality, Data Governance, and Data Protection.
Data is the new world currency. By applying a robust data management strategy, organisations can increase revenue, reduce costs, gain competitive advantage, and prepare for the forthcoming EU Data Protection Regulation.
But organisations should first ensure they are practising effective information governance. That can only exist when a business adopts a “factory approach” to data integration and data quality, so it knows what data it collects, where and who it came from, how and where it is stored, how it is used and for what, where it is going, how many copies exist, and the quality and integrity of the data at any one time.
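The enumerated questions above amount, in practice, to a metadata record maintained for each data set in the “factory”. A minimal, hypothetical sketch of such a record follows — every field name and value is invented for illustration and does not reflect any particular product's schema:

```python
from dataclasses import dataclass, field

@dataclass
class DataAssetRecord:
    """One governance metadata record answering, for a single data set,
    the 'factory approach' questions: what is collected, where and who it
    came from, where and how it is stored, what it is used for, how many
    copies exist, and its measured quality. Field names are illustrative."""
    name: str
    collected_from: str                             # who the data came from
    source_system: str                              # where it came from
    storage_location: str                           # where and how it is stored
    used_for: list = field(default_factory=list)    # what it is used for
    copies: int = 1                                 # how many copies exist
    quality_score: float = 0.0                      # quality at last check

# Example record for a hypothetical customer e-mail data set.
customer_emails = DataAssetRecord(
    name="customer_emails",
    collected_from="customers (web sign-up form)",
    source_system="CRM",
    storage_location="warehouse.crm.contacts (encrypted at rest)",
    used_for=["billing", "marketing opt-in"],
    copies=2,
    quality_score=0.97,
)
print(customer_emails.name, customer_emails.copies)  # → customer_emails 2
```

An organisation that can produce such a record on demand for every data set it holds is well placed both for governance audits and for Data Protection compliance.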
What you will learn during this event:
- How effective compliance and Enterprise Information Management starts with a focus on quality within your “information factories”.
- How to apply Information Quality principles and techniques to meet current and forthcoming Data Protection regulations.
- How to support the implementation of recognised standards for Data Protection governance.
- A methodology for identifying critical quality characteristics and how you measure them.
- How pervasive data quality helps create and maintain a true single view of customer – with specific reference to Irish and Gaelic identity matching. (We are also GeoDirectory certified by An Post for Irish address validation).
- Best practices for decreasing the risk of data breaches through the effective management and protection of private and sensitive data.
- How to position high quality data as central to your company’s customer-facing initiatives.
Informatica Public Sector Resource Centre: Best Practices for Delivering Customer-centric Public Services
This Informatica Public Sector Resource Centre provides valuable insight into best practices for integrating information and delivering it to any system in any context – so departments and agencies can rapidly overcome almost any data-centric challenge. These best practices will help eliminate the barriers that prevent government organisations from meeting their data requirements for information sharing, transparency, and accountability.
Data and the dedicated people who use it are the two most important assets for any public sector organisation. Whether data is in unstructured, structured, or archived forms – and regardless of whether it resides inside the enterprise, in partners’ systems, or in the cloud – the ability to get to it, trust it, and use it to drive effective actions is key to increasing operational efficiency and providing truly customer-centric public services.
What you will learn from this Public Sector Resource Centre:
- Best practices for rapidly integrating new form-based online applications with back office databases
- How to mitigate the risk that “unmanaged” data inputs pose to corporate data quality
- Best practices for integrating cloud data
- How to address the challenges of data being derived from customers/citizens, partners and suppliers in a variety of languages and even different countries
- How high-precision Identity Data can help create a true single view of the customer/citizen
- How to rapidly build data services so data, rules and policies can be reused throughout the enterprise
- The role of an Integration Competency Center within a shared service environment
- How to maximise the value of existing technology investments
- Why collaboration between business and IT is pivotal in delivering truly customer-centric public services
- How pervasive data quality helps create and maintain a true single view of customer/citizen