
By PenchalaRaju Yanamala

Version 8.6 New Features and Enhancements

Command Line Programs


New infacmd commands. The infacmd command line program includes two
new commands, CreateRTMService and UpdateRTMService. For more
information, see the PowerCenter Command Reference.

Code Pages
Compatibility. PowerCenter, PowerExchange, and Complex Data Exchange
support the same code pages.

Datatypes
Byte. PowerCenter imports the Informix Byte datatype as a Binary
transformation datatype. The minimum value of the Byte datatype is 1 byte.
There is no maximum value. For more information, see the PowerCenter
Designer Guide.
Uniqueidentifier. PowerCenter imports the Microsoft SQL Server
uniqueidentifier datatype as a Microsoft SQL Server Varchar datatype of 38
characters. For more information, see the PowerCenter Designer Guide.
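Why 38 characters: the braced text form of a GUID is 32 hex digits plus 4 hyphens plus 2 braces. A quick illustrative check in Python (a sketch for intuition, not PowerCenter code):

```python
import uuid

# Render a GUID in SQL Server's braced text form:
# 32 hex digits + 4 hyphens + 2 braces = 38 characters,
# which matches the Varchar precision used on import.
guid = "{" + str(uuid.uuid4()).upper() + "}"
print(len(guid))  # 38
```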

Designer
Workflow Generation Wizard. You can use the Workflow Generation Wizard to
create workflows and sessions from a mapping. For more information, see the
PowerCenter Designer Guide.

Functions
INSTR function string comparisons. When the Integration Service runs in
Unicode mode, you can use the comparison_type argument to specify whether
the INSTR function performs linguistic or binary string comparisons. For more
information, see the PowerCenter Transformation Language Reference.
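The difference between the two comparison modes can be pictured with a rough Python analogue. The accent folding below is only a stand-in for PowerCenter's linguistic collation rules, and the function is named after INSTR purely for readability:

```python
import unicodedata

def instr(haystack, needle, comparison_type=0):
    """Rough analogue of INSTR's comparison_type argument.
    comparison_type=1 -> binary: match code points exactly.
    comparison_type=0 -> linguistic: approximated here by stripping
    accent marks, so 'e' matches 'e' with an acute accent. This is an
    illustration, not PowerCenter's actual collation logic."""
    def fold(s):
        if comparison_type == 1:
            return s  # binary: compare characters as-is
        # remove combining marks as a stand-in for linguistic rules
        return "".join(c for c in unicodedata.normalize("NFD", s)
                       if not unicodedata.combining(c))
    # INSTR positions are 1-based; 0 means not found
    return fold(haystack).find(fold(needle)) + 1

print(instr("résumé", "sum", 1))     # 3: exact binary match
print(instr("résumé", "resume", 0))  # 1: matches once accents are folded
print(instr("résumé", "resume", 1))  # 0: no binary match
```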

Integration Service

Logging

Access log files with the high availability option. If you run a session on a
primary node that becomes unavailable, the Log Manager on the backup node
can access log files that are stored in a shared location specified by an absolute
path. For more information, see the PowerCenter Administrator Guide.

Parameters and Variables


Get the workflow run ID. Use a built-in mapping variable, session parameter,
or workflow variable to get the workflow run ID. For more information, see the
PowerCenter Workflow Administration Guide.

Partitioning

Based on number of CPUs. You can configure dynamic partitioning to set the
number of partitions equal to the number of CPUs on the node that prepares the
session. If the session is configured to run on a grid, dynamic partitioning sets
the number of partitions equal to the number of CPUs on the node that prepares
the session multiplied by the number of nodes in the grid. For more information,
see the PowerCenter Workflow Administration Guide.
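The partition-count rule above amounts to simple arithmetic; a minimal sketch:

```python
def dynamic_partition_count(cpus_on_preparing_node, grid_nodes=1):
    """Partition count under CPU-based dynamic partitioning: the number
    of CPUs on the node that prepares the session, multiplied by the
    number of nodes when the session runs on a grid. Illustrative
    arithmetic only."""
    return cpus_on_preparing_node * grid_nodes

print(dynamic_partition_count(8))                 # standalone node: 8
print(dynamic_partition_count(8, grid_nodes=4))   # 4-node grid: 32
```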

Performance

View percentage of time a thread spends in each transformation. The session log includes details about the percentage of time the Integration Service took to process each transformation in a thread. For more information, see the PowerCenter Performance Tuning Guide.

Real-time Recovery

Improved recovery. The Integration Service writes real-time session recovery information to a queue. The queue helps maintain data integrity during recovery so that no data is lost or duplicated.
When you enable recovery for a real-time session that reads from a JMS or
WebSphere MQ source and writes to a JMS or WebSphere MQ target, the
Integration Service writes recovery information to a recovery queue. The
recovery queue stores the reader state, the commit number, and the message ID
that the Integration Service committed to the target. When you recover a session,
the Integration Service uses the recovery information to determine where it
stopped processing.
The Integration Service also writes recovery information to a recovery ignore list for a
failed JMS or WebSphere MQ session. The recovery ignore list stores message IDs that
the Integration Service wrote to the target for the failed session. The Integration Service
writes recovery information to the list if there is a chance that the source did not receive
an acknowledgement. When you recover a session, the Integration Service uses the
recovery ignore list to prevent data duplication.
Partitioning. You can add partitions to a real-time session that includes a JMS
or WebSphere MQ source and is enabled for recovery.
Resilience. Resilience is available if you have the real-time option.

For more information, see the PowerCenter Workflow Administration Guide.
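The role of the recovery ignore list described above can be sketched as a simple de-duplication step on recovery. The data model here is hypothetical (plain dictionaries and a set of message IDs), not the Integration Service's internals:

```python
def recover(messages, recovery_ignore_list):
    """Skip messages whose IDs the recovery ignore list says were
    already written to the target during the failed run, so recovery
    does not duplicate them. Illustrative sketch only."""
    return [m for m in messages if m["id"] not in recovery_ignore_list]

pending = [{"id": "m1"}, {"id": "m2"}, {"id": "m3"}]
already_written = {"m1", "m2"}  # IDs from the recovery ignore list
print(recover(pending, already_written))  # only m3 is reprocessed
```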

Mapping Architect for Visio (Data Stencil)


Name change. Data Stencil is renamed Mapping Architect for Visio.
Terminology changes. Data Integration stencil is renamed to Informatica
stencil. Data Integration toolbar is renamed to Informatica toolbar.
New mapping objects. You can include a Stored Procedure transformation in a
mapping template or you can create a mapping template from a mapping that
contains a Stored Procedure transformation. You can create a mapping
template from a mapping that contains a mapplet. Or, you can import a mapplet
and add the related mapping objects to the mapping template. Mapplet
properties are read-only.
Shortcuts. You can configure a source definition or a target definition to use a
shortcut. You can create a mapping template from a mapping that contains
shortcuts to sources or targets.
Reusable transformation. You can configure a transformation to be reusable.
You can create a mapping template from a mapping that contains a reusable
transformation.
Multiple pipeline generation. When you import a mapping template in the
Designer, you can create a mapping with more than one pipeline.
Workflow Generation Wizard. You can launch the Workflow Generation
Wizard from the Import Mapping Template Wizard.

For more information, see the PowerCenter Mapping Architect for Visio Guide.

Metadata Manager
Business Name attribute. A metadata object attribute you can use to identify
metadata objects according to their business usage. For more information, see
the Metadata Manager Custom Metadata Integration Guide and the Metadata
Manager User Guide.
Data lineage. You can configure the number of resources that appear when you
launch a data lineage diagram and configure the number of resources that
appear when you navigate the data lineage diagram. For more information, see
the Metadata Manager User Guide.
Export and import custom attributes. Export or import custom attributes and
business name attributes for packaged resource types. You can export custom
attributes for metadata objects in a packaged resource type to an Excel file, edit
the attribute values in the Excel file, and import the attributes into the metadata
catalog. For more information, see the Metadata Manager Custom Metadata
Integration Guide.
Objects Relationship Wizard. Use the Objects Relationship Wizard to create
relationships for multiple custom metadata objects in the metadata catalog. For
more information, see the Metadata Manager User Guide.

Metadata Exchanges

IBM DB2 z/OS. Extract metadata from IBM DB2 z/OS.


Netezza. Extract metadata from Netezza.

For more information, see the Metadata Manager Administrator Guide.

Metadata Manager Service

Oracle service name. You can specify an Oracle service name or SID for a Metadata Manager repository on Oracle.
Oracle RAC. You can configure Oracle RAC for a Metadata Manager
repository.

For more information, see the PowerCenter Administrator Guide.

Reference Table Manager


Reference Table Manager application. A web application used to manage
reference data that is stored in reference tables. Use Reference Table Manager
to create, edit, import, and export reference data. You can also manage user
connections and view user information and audit trail logs. For more information,
see the PowerCenter Reference Table Manager Guide.

Transformations
HTTP. You can let the Integration Service determine the authentication type of
the HTTP server when the HTTP server does not return an authentication type
to the Integration Service. Or, you can specify the authentication type for the
Integration Service to use.
Unstructured Data. The Complex Data transformation is renamed the Unstructured Data transformation, and Complex Data Exchange is renamed Data Transformation. You can define additional ports and pass output rows to
relational targets from the Unstructured Data transformation. You can create
ports by populating the transformation from a Data Transformation service. You
can pass a dynamic service name to the Unstructured Data transformation with
source data.

For more information, see the PowerCenter Transformation Guide.

Web Services Hub


Processing chunked messages. The Web Services Hub can process chunked
messages from web service clients. For more information, see the PowerCenter
Web Services Provider Guide.

PowerChannel
Windows EM64T support. You can run a PowerChannel Server on Windows
EM64T (64-bit). For more information about PowerChannel, see the
PowerChannel User Guide.

PowerExchange (PowerCenter Connect)

PowerExchange for SAP NetWeaver BI (BW Option)

Name change. PowerExchange for SAP NetWeaver BW Option is renamed PowerExchange for SAP NetWeaver BI. For more information, see the PowerExchange for SAP NetWeaver User Guide.

PowerExchange for SAP NetWeaver (mySAP Option)


Name change. PowerExchange for SAP NetWeaver mySAP Option is renamed
PowerExchange for SAP NetWeaver.
Real-time sessions with SAP/ALE IDoc Prepare transformations. You can
run a real-time session when the mapping contains an SAP/ALE IDoc Prepare
transformation.
Table type parameters. You can import BAPI/RFC transformations from BAPIs
that contain table type parameters.

For more information, see the PowerExchange for SAP NetWeaver User Guide.

PowerExchange for Web Services

Authentication type configuration. You can let the Integration Service determine the authentication type of the web service provider when the web service provider does not return an authentication type to the Integration Service. Or, you can specify the authentication type for the Integration Service to use. For more information, see the PowerExchange for Web Services User Guide.

Version 8.5.1 New Features and Enhancements

This section describes new features and enhancements in PowerCenter 8.5.1:

Code Pages

Code Pages

PowerCenter 8.5.1 supports the following code pages:

IBM1159
IBM13121
IBM13124
IBM4933
IBM835
IBM836
IBM837

Version 8.5 New Features and Enhancements

Command Line Programs

infacmd

The infacmd command line program contains commands in the following groups:

Operating system profiles. Added commands to create, list, remove, and update operating system profiles.
User. Added commands to complete the following tasks:
- Create and remove roles.
- Add, list, and remove privileges for roles.
- Assign roles to a user or group.
- Add user permissions.
- List security domains.
User Management. Added commands to import or export native users and
groups, and to set LDAP connectivity.
Repository Service. Added commands to set up LDAP authentication when
upgrading from PowerCenter version 8.1.1 and earlier to PowerCenter 8.5.
Reporting Service. Added commands to complete the following tasks:
- Create, delete, upgrade, back up, and restore contents in a Data Analyzer repository.
- Create and update a Reporting Service.
- Upgrade users and groups in a Data Analyzer repository.
Metadata Manager Service. Added commands to create, update, and assign or
unassign an Integration Service.

pmrep

The pmrep command line program includes the new commands AssignPermission, ChangeOwner, and RollBackDeployment.

Data Analyzer (PowerAnalyzer)


Reporting Service. The Reporting Service is an application service that runs
the Data Analyzer application in a PowerCenter domain. You create and enable
a Reporting Service on the Domain page of the PowerCenter Administration
Console. When you enable the Reporting Service, the Administration Console
starts Data Analyzer.
You log in to Data Analyzer to create and run reports on data in a relational database or
to run the following PowerCenter reports: PowerCenter Repository Reports, Data
Analyzer Data Profiling Reports, or Metadata Manager Reports. You can launch Data
Analyzer from the Administration Console, PowerCenter Client tools, or Metadata
Manager, or by accessing the Data Analyzer URL from a browser.
Multiple values for dashboard filters. You can select multiple attribute values
for a dashboard filter.

Data Profiling

You can use a warehouse utility to create, drop, or upgrade a Data Profiling
warehouse. The warehouse utility also creates or upgrades a Data Profiling
warehouse schema and view for Data Profiling reports. The warehouse utility
runs the appropriate script for the destination database based on the command
and arguments you enter.

Data Stencil
Mapping Template Wizard. When you import a mapping template, you can
also choose from pre-defined templates. Pre-defined templates include slowly
changing dimension mappings.
Mapping parameters and variables. Configure additional attributes for
parameters in a mapping template. For example, specify a label or default value
for a parameter.
Datatypes
Bigint. The Integration Service processes 64-bit integer values. The Bigint
transformation datatype has a maximum value of 9,223,372,036,854,775,807
and a precision of 19. You can use the Bigint datatype for generated keys for
Sequence Generator, Lookup, Normalizer, and XML Source Qualifier
transformations.
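The Bigint maximum above is the largest signed 64-bit integer, and its 19 decimal digits are the stated precision. A quick check:

```python
# Largest signed 64-bit integer: 2^63 - 1.
max_bigint = 2**63 - 1
print(max_bigint)            # 9223372036854775807
print(len(str(max_bigint)))  # 19 digits -> precision of 19
```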

Domains
Configuration. After you install PowerCenter, use the Configuration Assistant in
the Administration Console to configure services for PowerCenter and Metadata
Manager.
License reports. Generate a license report to monitor the number of times you use
a database type as a source or target. You can monitor license usage to
determine if you need additional licenses for upcoming data integration projects.

Integration Service

Date Processing

Subseconds. The Integration Service can process milliseconds, microseconds,


and nanoseconds.
Datetime format. You can specify the default datetime format for Date functions
and port-to-port conversion in the session configuration object. In version 8.5,
the default datetime format specifies precision to the microsecond:
MM/DD/YYYY HH24:MI:SS.US. For upgraded session configuration objects, the
default datetime format specifies precision to the second.
Oracle Timestamp support. The Integration Service reads and writes Oracle
Timestamp values. The Integration Service converts Oracle Timestamp values
to PowerCenter Date/Time values.
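For comparison, the default MM/DD/YYYY HH24:MI:SS.US format maps onto the following strftime pattern in Python, where %f supplies the six-digit microsecond field corresponding to .US (an illustrative mapping, not PowerCenter code):

```python
from datetime import datetime

# PowerCenter: MM/DD/YYYY HH24:MI:SS.US
# Python:      %m/%d/%Y   %H:%M:%S.%f  (%f = 6-digit microseconds)
ts = datetime(2008, 5, 1, 13, 45, 30, 123456)
print(ts.strftime("%m/%d/%Y %H:%M:%S.%f"))  # 05/01/2008 13:45:30.123456
```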

Parameters and Variables

Expanded support for parameters and variables. You can enter parameters
and variables in more input fields in the Designer and Workflow Manager. For
example, you can use parameters and variables in source and target table
names, connection object user names and passwords, flat file reader and writer
code pages, stored procedure database connection information, and web
service host end point URLs.
Email variables. You can use email variables to include more information about
a session in post-session email. The variables allow you to include the
Integration Service name, repository user name, session run mode, workflow
name, and workflow run instance name in the email body or subject.
Mapping parameters and variables expansion. The Integration Service can
expand mapping parameters and variables within transformation expressions.
Subsecond support. Date/Time mapping parameters, mapping variables, and
workflow variables support values with precision to the nanosecond.
Get file names for sessions that use a file list. For sessions that read source
data from a file list, you can configure the associated mappings to return the
name of the file from which a row has been read.
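The shape of a file-list read that also exposes the source file name can be sketched as follows; the in-memory data and function name are hypothetical, standing in for the mapping-level feature:

```python
def read_file_list(file_list):
    """Yield (file_name, row) pairs, mimicking how a mapping configured
    for file-name return sees the name of the file each row came from.
    Illustrative: file contents are passed in-memory, not read from disk."""
    for name, rows in file_list:
        for row in rows:
            yield name, row

files = [("orders_east.csv", ["r1", "r2"]), ("orders_west.csv", ["r3"])]
print(list(read_file_list(files)))
```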
Override connection properties. You can override connection properties in a
parameter file using built-in session parameters. You can override connection
properties for application, external loader, FTP, queue, and relational
connection objects.
Get run-time information. Use built-in mapping variables, session
parameters, and workflow variables that allow you to get the
following run-time information:
Information                                  Mapping   Session    Workflow
                                             Variable  Parameter  Variable
Folder name                                  X         X          X
Integration Service name                     X         X          X
Mapping name                                 X         X
Repository Service name                      X         X          X
Repository user name                         X         X          X
Session name                                 X         X
Session run mode                             X         X
Source and target number of affected rows              X
Source and target number of applied rows               X
Source and target number of rejected rows              X
Source and target table name                 X         X
Workflow name                                X         X          X
Workflow run instance name                   X         X          X
Parameterize session parameter file names. You can define the session
parameter file name in a workflow parameter file. When you run multiple
instances of a workflow concurrently, you can define different parameter and
variable values for sessions in the workflow.
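A workflow parameter file that defines values for one workflow instance might look like the following sketch of the general parameter file layout. The folder, workflow, session, and parameter names here are hypothetical, chosen only to show the section structure:

```
[Global]
; values visible to every workflow in the file
$$SourceSystem=CRM

[Sales.WF:wf_load_orders]
; workflow-level parameter values for this instance of wf_load_orders
$$LoadDate=05/01/2008

[Sales.WF:wf_load_orders.ST:s_m_load_orders]
; session-level values for the s_m_load_orders session task
$$TargetSchema=STAGE
```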
Share information among sessions. Sessions within the same workflow or
parent worklet can share information using certain types of parameters and
variables. Before you run a session, you can update mapping parameters,
mapping variables, and session parameters with the values of parent workflow
or worklet variables. At the end of a session, you can update workflow and
worklet variables with mapping parameter, mapping variable, and session
parameter values. This allows one session to pass information to a subsequent
session in the same workflow or worklet.
Share information among worklets. Worklets within the same workflow or
parent worklet can share information using workflow and worklet variables.
Before a worklet runs, you can update worklet variables with the values of
parent workflow or worklet variables. After a worklet runs, you can update
parent workflow and worklet variables with worklet variable values. This allows
one worklet to pass information to a subsequent worklet in the same workflow or
parent worklet.

Pushdown Optimization

Lookup transformation. You can push a Lookup transformation with a lookup filter to the database with source-side, target-side, or full pushdown optimization.
Router transformation. You can push a Router transformation to the database
with source-side or full pushdown optimization.
Sequence Generator transformation. You can push a Sequence Generator
transformation to an Oracle or IBM DB2 database with source-side or full
pushdown optimization. The Integration Service creates a temporary sequence
object in the database to push the transformation logic to the database.
Update Strategy transformation. You can push an Update Strategy
transformation to the database with full pushdown optimization.
Slowly changing dimension mappings. You can push Type 1 and Type 3
slowly changing dimension mappings to an Oracle or IBM DB2 database.
Functions. The Integration Service can push the following functions to the
specified databases:
Function         IBM DB2  Microsoft    Oracle  Sybase  Teradata  ODBC
                          SQL Server
DECODE           X        X            X*      X       X         X
GREATEST()                             X
IN()             X*       X*           X*      X*      X*        X
LEAST()                                X
SYSTIMESTAMP()   X        X            X       X       X
TO_BIGINT()      X        X            X       X       X

* The Integration Service was able to push the function to this database in the
previous release.

Real-time Recovery

Improved recovery. The Integration Service writes real-time session recovery information to a database table. The table helps maintain data integrity during recovery so that no data is lost or duplicated.
When you enable recovery for a real-time session that reads from a JMS or WebSphere
MQ source and writes to a relational target, the Integration Service creates a recovery
table on the target database. The recovery table stores the commit number and the
message ID that the Integration Service committed to the target. When you recover a
session, the Integration Service uses the recovery information to determine where it
stopped processing.
Automatic recovery. You can restart a real-time session without losing data. If
you restart a real-time session that has recovery enabled with a JMS or
WebSphere MQ source, the Integration Service recovers all unprocessed
messages before it continues to process the session.
Operating system flush. When you enable recovery for a real-time session
that reads from a JMS or WebSphere MQ source and writes to a non-relational
target, you can prevent data loss if the Integration Service is not able to write
the recovery information to the recovery file. The Integration Service may fail to
write recovery information in cases of an operating system error, hardware
failure, or file system outage. When you configure this property, the Integration
Service flushes the messages that are in the operating system buffer to the
recovery file.
Stopping real-time sessions. When you stop a real-time session with a JMS,
WebSphere MQ, or changed data source, the Integration Service stops reading
from the source and continues to process all messages it read.
Message order. For a session with JMS or WebSphere MQ sources, the
Integration Service reads messages in the same order as they exist in the
source.

Recovery

Workflow and task recovery. You can start a failed workflow or task that has
recovery enabled without processing the recovery data. If you want to restart a
workflow or task without recovery, start the workflow or task in cold start mode.
You may not want to recover data if you already cleaned up the target system.
Session recovery. You can enable recovery for sessions that have Stored
Procedure and External Procedure transformations. In the transformation
properties, you can configure the transformation to be recoverable.

Session Properties

Custom properties. You can override Integration Service custom properties in a Session task.

Workflows

Concurrent workflows. You can run multiple instances of a workflow concurrently. You can run multiple instances of the same workflow name, or you
can define multiple instance names. When you define multiple instance names,
you configure different workflow parameter files for each instance. You can
define different sources, targets, and variables in the parameter files. The
Integration Service can persist separate workflow and worklet variables for each
workflow instance.
When you run multiple instances of the same workflow name, the Integration
Service defines each instance by the run ID. You do not define separate
parameters for each instance and the Integration Service does not persist
variables for them. You might run multiple instances of the same workflow when
the workflow is a web service workflow.

Metadata Manager (SuperGlue)

Architecture

Metadata Manager Service. Runs the Metadata Manager application in a PowerCenter domain and manages access to metadata in the Metadata Manager warehouse. You must create and enable a Metadata Manager Service in the PowerCenter Administration Console to run Metadata Manager.
User interface. The Metadata Manager interface is a browser-based application
that includes the following pages that you can use to perform tasks in Metadata
Manager:
- Browse. Browse and search the metadata catalog, create and view shortcuts and shared folders, run data lineage and where-used analysis, and edit metadata objects and object attributes.
- Model. Create and edit custom models, add custom attributes to existing and custom models, and import and export custom models.
- Load. Create and configure resources and load metadata into the Metadata Manager warehouse. Use the Load page to monitor and schedule resource loads and purge metadata from the Metadata Manager warehouse. You also use the Load page to manage the Metadata Manager search index.
- Security. Manage permissions on metadata objects in the Metadata Manager repository.
The Metadata Manager user interface replaces the functionality in the Metadata Manager
Console and Metadata Manager application in previous Metadata Manager versions.
Metadata Manager Agent. Required by some resources to load metadata from
metadata sources. You must install the Metadata Manager Agent on the source
machines before you load metadata from Business Objects, Microstrategy, and
Cognos Impromptu. You also use the Metadata Manager Agent to extract
metadata from metadata source files. You can download the Metadata Manager
Agent from the Load page.
Reporting Service. Metadata Manager does not embed Data Analyzer
functionality. Create a Reporting Service to run reports on the Metadata
Manager warehouse.

Browse and Search Metadata

Metadata catalog. Shows resources and metadata objects in the Metadata Manager warehouse. Use the metadata catalog to browse and edit metadata objects and run data lineage and where-used analysis.
You must load metadata for a resource into the Metadata Manager warehouse before you
can browse the metadata objects in the catalog.
Search. Includes basic and advanced keyword search in the Browse page. Use
basic search to search all properties for metadata objects. Use advanced
search to limit the object properties and object types that Metadata Manager
searches. You can also save searches to access common searches.
Shortcuts. You can create shortcuts to commonly accessed metadata objects
in the metadata catalog. Shortcuts also include saved searches. In addition, you can share folders in shortcuts with other Metadata Manager users.
Tabbed browsing. You can use tabbed browsing to view details about multiple
where-used and data lineage analyses at the same time.
Profiling. You can view profiling information for metadata sources that include relational metadata. When you load metadata for a resource, you can include profiling data for the metadata source. After you load the metadata
for a resource, view the profiling information for the metadata object on the
Browse page.

Edit Metadata Objects

Annotations. Add comments to a metadata object in the metadata catalog. Use annotations to share comments with other Metadata Manager users.
Relationships. Relationships are associations between metadata objects in the
metadata catalog. You can view existing object relationships between metadata
objects or create relationships between metadata objects in the metadata
catalog.
Supporting documents. Create links to documents on a company intranet, shared drive, or the internet. Use supporting documents to add external information to a metadata object in the metadata catalog.
Edit attributes. Edit attributes for custom metadata objects and attributes you
added to metadata objects for packaged resources.

Custom Models

You can use the Model page to create custom models to add custom metadata into the
Metadata Manager warehouse. You can create or edit the following types of metadata:

Custom models. Create or edit custom models in the Model page. You can
create a custom model to include metadata in the Metadata Manager
warehouse for which Metadata Manager does not include a packaged resource
type. You can create a model and add custom classes, relationships, and
attributes to define the model.
Custom attributes. Add custom attributes to packaged models.

Data Lineage and Where-Used Analysis

You run data lineage analysis and where-used analysis on the Browse page.
After you run data lineage analysis on an object, you can use the Flash-based
viewer to view the lineage diagram. You can also view the diagram in a separate
browser window. You can open multiple data lineage and where-used analyses
simultaneously.

You can access data lineage and where-used analysis for a metadata object
from a shortcut to the object, from saved search queries, from the metadata
catalog, or from where-used analysis or data lineage.

Metadata Exchanges (XConnects)

SAP Metadata Exchange. Extracts metadata from SAP R/3.


Microstrategy Metadata Exchange. Extracts reporting metadata.

User Preferences
Catalog. You can select the resources to display in the metadata catalog.
Browse. You can select the metadata object properties that appear when you view details about a metadata object.

Metadata Resources

Resources represent metadata sources in Metadata Manager. Use the Resource Creation Wizard on the Load page to create a resource for a metadata source and configure the resource properties, parameters, connection assignments, and schedule for the resource.

You must create a resource in the Load page before you can load metadata for a
metadata source in the Metadata Manager warehouse. You can edit the
configuration for a resource after you run the Resource Wizard by selecting the
resource in the Load page and editing the resource configuration.

Resources also include the following features:

Parallel resource loads. You can load multiple resources simultaneously.


mmcmd. Use mmcmd to load, resume, and get the status of a resource load
from the command line or from a script. For example, you can use mmcmd to
run Metadata Exchanges using an external scheduler.
mmwfrundetails. Use mmwfrundetails to get information about the
PowerCenter workflow and sessions.
Schedules. You can create schedules for resource loads and assign the
schedules to resources. When you create a schedule, you configure the name,
description, and the time and frequency. After you create the schedule, you can
assign the schedule to individual metadata resources.

PowerCenter Repository Reports

New Reports

The PowerCenter Repository Reports include the following reports:

Mapping Composite Report. This composite report consists of subreports that list the sources, targets, and transformations in a mapping. To generate this report, right-click a mapping in the PowerCenter Mapping Designer and select the View Mapping Report option.
The composite report includes the following subreports:
Mapping Object-Level Connections. Displays all transformations from source to target in a mapping and how they are connected to each other.
Mapping Port-Level Connections. Displays all transformations from source to target in a mapping and how they are connected to each other through ports.
Mapping Source Field Details. Displays column names for all sources by repository, folder, and mapping. It also displays properties of these columns such as datatype, precision, and length.
Mapping Target Field Details. Displays column names for targets by repository, folder, and mapping. It also displays properties of these columns such as datatype, length, and precision.
Mapping Transformation Port Details. Displays ports in a transformation by repository, folder, and mapping. It also displays properties of the ports.
Mapping Unconnected Ports. Displays all unconnected transformation ports from source to target in a mapping.
Mapplet Composite Report. This composite report consists of subreports that list the sources and transformations in a mapplet. To generate this report, right-click a mapplet in the PowerCenter Mapplet Designer and select View Mapplet Report.
The composite report includes the following subreports:
All Objects Used in a Mapplet. Displays all transformations used in a mapplet by repository and folder.
Mapplet Mapping Dependency. Displays all the mappings where a particular mapplet is used by repository and folder.
Mapplet Port Details. Displays all source ports available in each mapplet and port properties.
Mapplet Sources. Displays sources in a mapplet by repository and folder.
Mapplet Source Field Details. Displays column names for all sources by repository, folder, and mapplet. It also displays properties for these columns such as datatype, length, and precision. This report is the second node in a workflow associated with the Mapplet List primary report.
Mapplet Transformations. Displays transformations used in a mapplet by repository and folder.
Mapplet Transformation Port Details. Displays ports and port properties for the transformations used in a mapplet by repository and folder. This report is the second node in a workflow associated with the Mapplet List primary report.
Mapplet Transformation Properties. Displays the default or user-defined properties for transformations in a particular mapplet by repository and folder. This report is the second node in a workflow associated with the Mapplet List primary report.
Mapplet Lookup Transformations. Displays all Lookup transformations used in a mapplet by repository and folder.
Workflow Composite Report. This composite report consists of subreports that display workflow tasks, events, and variables. To generate this report, right-click a workflow in the PowerCenter Workflow Designer and select View Workflow Report.
The composite report includes the following subreports:
Workflow Events. Displays workflow events and their properties by repository and folder.
Workflow Task Instance Link Conditions. Displays how tasks are connected to each other in a workflow by repository and folder.
Workflow Task Instance List. Displays all tasks created in a workflow by repository and folder.
Workflow Variables. Displays workflow variables and their properties by repository and folder.

Repository Service

The PowerCenter Repository Service includes the following new features and
enhancements:

Exchanging Metadata. You can use the Meta Integration Model Bridge from
Meta Integration Technology, Inc. to exchange data with CA ERwin Data
Modeler 7.x.
Optimizing Repository Schema. For IBM DB2 and Microsoft SQL Server
repositories, you can improve performance by enabling the Repository Service
to create tables using Varchar(2000) columns instead of CLOB columns.

Security

Unified Security Administration

You administer PowerCenter security on the Security page of the Administration Console. You manage users and groups that can log in to the following PowerCenter applications:

Administration Console
PowerCenter Client
Metadata Manager
Data Analyzer

Privileges determine the actions that users can perform in PowerCenter applications. You assign privileges to users and groups for the domain and for each of the following application services in the domain: Repository Service, Metadata Manager Service, and Reporting Service.

Roles are collections of privileges. If groups of users perform similar tasks, you
can create and assign roles to grant privileges to the users.

Authentication

The PowerCenter Service Manager uses the following authentication methods to authenticate users logging in to PowerCenter applications:

Native. You create and manage users and groups on the Security page of the
Administration Console. The Service Manager stores the users and groups in
the domain configuration database.
Lightweight Directory Access Protocol (LDAP). You configure a connection
to an LDAP directory service on the Security page of the Administration
Console. You also manage the users and groups that can log in to PowerCenter
applications, and manage the privileges and roles that determine the actions
that users can perform. The Service Manager imports the users and groups set
up in the LDAP directory service into the domain configuration database.

For both authentication methods, users are located in a security domain. A security domain is a collection of user accounts and groups in a PowerCenter domain. When users log in to PowerCenter applications, they select the security domain for their user account.

Single Sign-On

After you log in to a PowerCenter application, you can launch another application
or access multiple repositories in the PowerCenter Client. You do not need to log
in to the additional application or repository. For example, if you launch a Data
Analyzer report from the PowerCenter Workflow Manager, you do not need to log
in to Data Analyzer.

HTTPS

Domain. When you install PowerCenter, you can create or specify a keystore file to configure HTTPS. Configure an HTTPS port for the Administration Console when you install PowerCenter or use the defineDomain, defineGatewayNode, or defineWorkerNode commands. Specify the HTTPS ports for Metadata Manager and the Reporting Service when you create the services in the Administration Console.
Data Analyzer. When you install PowerCenter, you can create or specify a
keystore file to configure HTTPS. When you create a Reporting Service in the
PowerCenter Administration Console, you specify the HTTPS port for Data
Analyzer.
Web Services Hub. You can send requests to a Web Services Hub through
HTTP or HTTPS or both. You can set the HTTP and HTTPS port on the
Administration Console.

Permissions

Domain. Assign permissions on domain objects to users and groups. When a group has permission on a domain object, all users belonging to the group inherit permission on the domain object.
Repository. You can add any user or group to the permissions list for a folder.
You assign folder permissions to each user or group you added. A folder or
global object owner has full permissions on the folder or global object. You
cannot edit an owner’s permissions.
Metadata Manager. Edit the permissions for a metadata object in the Security
page, including the users and groups that have permissions on the object.
Permissions include full control, read, write, and no access.

Privileges

Domain. Assign privileges to users and groups to determine the actions that
users can perform in the Administration Console. You assign privileges to users
and groups for the domain on the Security page of the Administration Console.
Repository. You can assign a new set of privileges to users and groups to
determine the actions that users can perform in PowerCenter Client
applications. You assign privileges to users and groups for a Repository Service
on the Security page of the Administration Console.

Integration Service
Secure FTP. You can configure FTP connection objects to use SFTP when
connecting to SFTP servers. By default, SFTP is disabled.
Operating system profiles. An operating system profile is a level of security that the Integration Service uses to run workflows. The operating system profile contains the operating system user name, service process variables, and environment variables. The Integration Service runs the workflow with the system permissions of the operating system user and the settings defined in the operating system profile. You can configure the Integration Service to use operating system profiles if the Integration Service runs on UNIX.

Transformations
Complex Data transformation. You can parse streaming data from EDI or
HIPAA files with the Complex Data transformation. The transformation outputs
data in multiple rows.
Lookup filter. You can configure the Lookup transformation to filter the rows
returned from a lookup. You filter the returned rows to increase lookup
performance.
Lookup cache shared across pipelines. If you configure a Lookup
transformation to use a persistent lookup cache and named cache files, you can
share the cache among multiple Lookup transformations. You can configure a
Lookup transformation to cache the lookup table and then configure other
Lookup transformations to share the cache. The first Lookup transformation
uses dynamic cache and the rest of the transformations use static cache. The
Lookup transformations can be in different pipelines.
Pipeline lookup. You can perform lookups on sources other than relational
tables and flat files, such as JMS and MSMQ. When you create a mapping, you
can connect a Lookup transformation to an unconnected Source Qualifier
transformation that connects to a relational, flat file, JMS, or MSMQ source. The
Integration Service reads the source data coming through the Source Qualifier,
caches it, and performs the lookup.
Lookup and Stored Procedure subsecond precision. You can specify
precision up to nanoseconds for Date/Time values in Lookup and Stored
Procedure transformations.
XML Parser transformation. The XML Parser transformation can accept
multiple row input. You can parse large XML files and streaming data from
Informatica PowerExchange for Complex Data by configuring the XML Parser
transformation to receive data from more than one input row. Each input row
consists of a string that contains the XML data and an integer port that indicates
whether the row is the end of the XML stream.
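The multi-row input model can be pictured with a small sketch: each row carries an XML fragment plus an end-of-stream flag, and the parser assembles and parses the document once the flag is set. This is an illustrative Python analogy, not PowerCenter code; the row layout (string port plus integer flag port) mirrors the description above, and the sample data is invented.

```python
import xml.etree.ElementTree as ET

def parse_streamed_rows(rows):
    """Assemble XML fragments spread across input rows and parse each
    complete document. Each row is (xml_fragment, end_of_stream_flag),
    mirroring the string port and integer port described above."""
    buffer = []
    documents = []
    for fragment, is_end in rows:
        buffer.append(fragment)
        if is_end:  # flag row: the XML stream is complete
            documents.append(ET.fromstring("".join(buffer)))
            buffer = []
    return documents

# Example: one document split across three input rows.
rows = [
    ("<order><id>", 0),
    ("42</id>", 0),
    ("</order>", 1),
]
docs = parse_streamed_rows(rows)
```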

Transformation Language
REG_REPLACE. Use REG_REPLACE to replace a character pattern in a string
with another character pattern.
SYSTIMESTAMP. Use SYSTIMESTAMP to return the current date and time of
the machine hosting the Integration Service with precision to the nanosecond.
TO_BIGINT. Use TO_BIGINT to convert a string or numeric value to a bigint
value.
User-defined functions. User-defined functions appear in the Navigator of the
Repository Manager and the Designer. You can version, view dependencies,
and query user-defined functions in the repository. You can also add user-
defined functions to deployment groups.
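As a rough analogy for the first three functions above, the equivalent operations look like this in Python. This is an illustrative sketch, not PowerCenter expression syntax, and the sample values are invented.

```python
import re
import time

# REG_REPLACE-style substitution: replace every digit with "X".
masked = re.sub(r"\d", "X", "PHONE 555-1234")

# SYSTIMESTAMP-style read: current time with nanosecond precision.
now_ns = time.time_ns()

# TO_BIGINT-style conversion: string to a 64-bit-range integer.
big = int("1234567890123")
```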

Web Services Provider

Web Services Hub

Multiple Web Services Hubs associated with a repository. You can assign
more than one Web Services Hub to a repository. You can run a web service on
more than one Web Services Hub. You can publish a web service to all the Web
Services Hubs associated with a repository or publish the web service to
specific Web Services Hubs.
Multiple web service instances on a Web Services Hub. The Web Services
Hub can run more than one instance of a web service to process requests. You
can set the threshold at which a new instance will be started.
Web Services Hub security mode. You can run the Web Services Hub on
HTTP or HTTPS or both. If you run the Web Services Hub on HTTP and
HTTPS, the Administration Console displays the URL for both modes.
Support of a load balancer. You can use a third-party load balancer to
manage multiple Web Services Hubs and set one logical hub address for all the
Web Services Hubs managed by the load balancer. The logical hub address is
published in the WSDL of the web services that run in the managed Web
Services Hubs.
Web Services Report. On the Administration Console, you can run a report on
the activities of a Web Services Hub and the web services running on the Web
Services Hub. You can view statistics on the requests received by the Web
Services Hub and the average time it took to process messages.
Web Services Hub Console. The Web Services Hub console displays
information about the web services and operations available on the Web
Services Hub in a new format. You can sort the list of web services and
operations and search for specific web services or operations.
Try-It client application. You can use the Try-It application to test an operation
in a web service published on the Web Services Hub console. Provide the
request as a SOAP message or as parameter values and then run the web
service. The Web Services Hub displays the response on the console.
Real-time web services. Sample programs are available that demonstrate how
to create a real-time web service in PowerCenter.
Batch web services operations. New operations are available in the Batch web services to get information about executed tasks and failed tasks. The Data Integration web services provide the following new operations:
GetTaskDetailsEx. Similar to the getTaskDetails operation, but returns information about all instances of a task. Use this operation when multiple instances of a workflow run concurrently.
GetWorkflowDetailsEx. Similar to the GetWorkflowDetails operation, but returns information about all instances of a workflow. Use this operation when multiple instances of a workflow run concurrently.
StartWorkflowEx. Returns the run instance ID of the workflow. Use the StartWorkflowEx operation instead of the startWorkflow operation if you need to know the run ID of the workflow started by the operation.
New properties. The Web Services Provider provides the following new properties for the Web Services Hub Service:
SessionExpiryPeriod. Number of seconds that a session can remain idle before the session times out and the session ID becomes invalid. The Web Services Hub resets the start of the timeout period every time a client application sends a request with a valid session ID.
HubLogicalAddress. URL for the third-party load balancer that manages the Web Services Hub. This URL is published in the WSDL for all web services that run on a Web Services Hub managed by the load balancer.
MaxStatsHistory. Number of days that PowerCenter keeps statistical information in the history file. This property determines the number of days for which you can display historical statistics in the Web Services Report page of the Administration Console.
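The SessionExpiryPeriod behavior, an idle timeout that resets on every valid request, can be sketched as follows. This is an illustrative model, not Web Services Hub code; the class and method names are invented for the example.

```python
import time

class Session:
    """Minimal idle-timeout session: the expiry clock restarts on
    every request that arrives while the session ID is still valid."""
    def __init__(self, expiry_period_seconds, clock=time.monotonic):
        self.expiry_period = expiry_period_seconds
        self.clock = clock
        self.last_request = self.clock()

    def handle_request(self):
        now = self.clock()
        if now - self.last_request > self.expiry_period:
            return False  # session timed out; the session ID is invalid
        self.last_request = now  # a valid request resets the timeout
        return True
```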

Source and Target Definitions

Creating sources and targets. You can create web service source and target
definitions without using a WSDL. You can define the columns manually or get
the definition of the columns from existing relational or flat file sources and
targets. You can also import a web service source or target from a WSDL
without creating XML views. You can edit the empty source or target definition in
the WSDL workspace to manually add the XML views.
You can create web service source and target definitions in one process.
Editing sources and targets. If you create a web service source or target
definition from a WSDL, you can add, edit, or delete the XML views of web
service source or target definitions in the WSDL workspace. If you create a web
service source or target definition without a WSDL, you can edit the XML views
in the Designer workspace.
Consolidated target definition. All XML views in a web service target are
contained in one definition, including fault views.

Mappings and Workflows

Creating web service mappings. You can create a web service mapping by
defining the source and target manually or basing the source and target
definitions on existing relational or flat file sources and targets. You can also
create a web service mapping based on a reusable transformation or a mapplet.
WSDL. If you create a mapping from a relational or flat file source or target,
reusable transformation, or mapplet, you can configure the Web Services Hub
to generate the WSDL for the mapping.
New workflow properties. The Web Services Provider provides the following new properties for the web service workflow:
Service Time Threshold (Milliseconds). Maximum amount of time the Web Services Hub can take to process requests before it starts another instance to process the next request.
Web Services Hubs. Web Services Hub Service to run the workflow. By default, all Web Services Hub Services associated with the repository run the web service workflow.
Maximum Run Count Per Hub. Maximum number of web service instances that can be started in a Web Services Hub. All instances of the web service workflow running on the Web Services Hub are included in the count, whether the instance is started dynamically or manually. The Web Services Hub cannot start another web service instance once the maximum is reached.

XML
Convert anyType to string. You can convert an XML element of type anyType to a string by dragging the anyType element into a view. You create a string port in the datatype field of the XML Editor. You can define the length of the string port.

PowerExchange (PowerCenter Connect)

PowerCenter Connect products are renamed to PowerExchange.

PowerExchange for WebSphere MQ

Improved recovery. The Integration Service writes real-time session recovery information in a database table to maintain data integrity. You can stop and restart a WebSphere MQ real-time session without losing data. When you enable recovery for a real-time session that reads from a WebSphere MQ source and writes to a non-relational target, you can prevent data loss if the Integration Service is not able to write the recovery information to the file cache. For more information, see Real-time Recovery.
WebSphere MQ Unicode systems. You can use a UCS-2 code page to
process data from a WebSphere MQ system.

PowerExchange for JMS

Improved recovery. The Integration Service writes real-time session recovery information in a database table to maintain data integrity. You can stop and restart a JMS real-time session without losing data. When you enable recovery for a real-time session that reads from a JMS source and writes to a non-relational target, you can prevent data loss if the Integration Service is not able to write the recovery information to the file cache. For more information, see Real-time Recovery.

PowerExchange for SAP NetWeaver BW Option

BW OHS source definitions. You can import InfoSpokes as metadata for BW OHS source definitions for BW data extraction.
Starting process chain from PowerCenter or SAP. You can start a process
chain from PowerCenter or SAP to extract data.
Monitoring data extraction from PowerCenter or SAP. You can monitor the
progress of data extraction from PowerCenter or SAP.

PowerExchange for SAP NetWeaver mySAP Option

BAPI/RFC integration includes the following enhancements:

BAPI/RFC transformation. You can import a BAPI/RFC transformation from BAPIs with input and output parameters.
TYPE type specifier. You can import BAPIs with parameters that use the TYPE
type specifier as BAPI/RFC transformations. The TYPE type specifier can use
data elements, structures, or fields to declare the BAPI parameter.
SAP Unicode systems. You can use a UTF-8 code page to process BAPI/RFC data in a Unicode SAP system.
Decimal and Binary data. Ports in BAPI/RFC transformations can be of the
Decimal or Binary datatype.
High precision. You can enable high precision for sessions with BAPI/RFC
transformations.
Generate BAPI/RFC data by transaction. You can generate a BAPI/RFC transaction based on the commit interval. The number you enter for the commit interval is the number of BAPI/RFC calls in a transaction.
Error output group. The Integration Service writes data for invalid BAPI/RFC calls or data conversion errors to an error output group in comma-delimited format.
System variables for BAPI/RFC input data. If the mapping does not provide
an input value for some parameters of a BAPI/RFC transformation, the
Integration Service can use SAP system variables as input values.

PowerExchange for Web Services

Compression. The Integration Service compresses SOAP requests and responses to increase the speed and efficiency of sending and receiving messages over the network.
Creating sources and targets. You can create web service source and target
definitions without using a WSDL. You can define the columns manually or get
the definition of the columns from existing relational or flat file sources and
targets.
Editing sources and targets. You can add, edit, or delete the XML views of
web service source or target definitions in the WSDL workspace. If you create a
web service source or target definition without a WSDL, you can edit the XML
views in the Designer workspace.

PowerExchange (PowerCenter Connect)

Performance

Asynchronous Write. You can use Asynchronous Write with Fault Tolerance to optimize performance. PowerExchange transfers data asynchronously, increasing transfer speed, while providing assurance that the data is transferred correctly and safely. The Fault Tolerance mechanism logs any errors produced during transmission.
You can use Asynchronous Write with Fault Tolerance on the following platforms:
- DB2 on MVS
- DB2 on AS/400
- DB2 UDB on UNIX or Windows
- Oracle on UNIX or Windows
Continuous PowerExchange Change (CAPX). Runs PowerExchange Change (CAPX) extractions continuously. The extraction process does not end at the end of a condense file, which allows PowerExchange to read all available condense files as they arrive. Continuous PowerExchange Change improves performance by reducing latency. You can use Continuous PowerExchange Change for Oracle on UNIX and Windows.
DB2 array processing. You can use DB2 array processing, for example, Array
INSERT or FETCH, to deliver enhanced throughput, greater efficiency, and
better performance when reading from and writing to DB2 V8 in New-Function
Mode on MVS.

Connectivity

DB2 UDB Change Capture partitioned database. You can use partitioned
DB2 UDB data sources.
IMS Write and IMS Lookup. You can write to IMS databases and use IMS sources in PowerExchange Lookup transformations in PowerCenter.
BIGINT and Extended Timestamps. PowerExchange supports the BIGINT and Extended Timestamp datatypes.

Localization

Multibyte characters. You can use multibyte characters in table and column names, which allows localization for customers in Asia Pacific. You can access multibyte table and column names through Change Capture for Oracle, DB2 UDB, and SQL Server on UNIX, Linux, and Windows platforms.

Security

Secure Sockets Layer (SSL) communication. You can configure PowerExchange to protect data using SSL communication. To ensure that the data is secure, all machines that are connected to the PowerExchange network must communicate in a secure manner and must be configured to establish and accept SSL communication. Establish certificates and keys that authorize the secure connection between two machines and allow encryption and decryption of data.
You can configure one of the following modes for SSL communication:
- Request client authentication. In the default mode, the PowerExchange server on your installation authenticates the identity of the client. The server requests the client certificate and checks that it can be validated against its certificate authority list.
- Request server authentication. The client checks the identity of the server.
- Do not request certificate authentication. Neither the client nor the server requests certificates or checks that the certificates can be validated against its certificate authority list. SSL protection is still enabled.
- Client and server request certificate authentication. The client checks the identity of the server, and the server checks the identity of the client.
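These modes correspond to standard TLS certificate-verification settings. As an illustrative sketch using Python's ssl module (not PowerExchange configuration syntax), the server-side and client-side choices look like this:

```python
import ssl

# "Request client authentication": the server requests the client's
# certificate and validates it against its certificate authority list.
server_mutual = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
server_mutual.verify_mode = ssl.CERT_REQUIRED

# "Do not request certificate authentication": no certificates are
# requested or validated, but the connection is still encrypted.
server_open = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
server_open.verify_mode = ssl.CERT_NONE

# "Request server authentication": the client validates the server's
# certificate against its certificate authority list.
client = ssl.create_default_context()  # verify_mode is CERT_REQUIRED
```

Real deployments would additionally load key and certificate files on both sides before any connection is attempted.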
