What is the difference between a DTP and an InfoPackage?
The Data Transfer Process (DTP) is used to load data in the following circumstances, and offers these advantages:

Delta Management: Suppose you are loading 5 data targets and 4 load successfully but 1 fails. With an InfoPackage, you have to delete the request from all 5 targets and reload everything. With a DTP, you only delete the failed request and reload the failed target. Because a DTP follows a one-to-one mechanism (one DTP per data target), loading a single data target is much easier.

Handling Duplicate Records: A DTP handles duplicate data easily; you just enable the Handle Duplicate Record Keys option on the Update tab.
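The duplicate-key option can be pictured as keeping only the last record per semantic key within a data package. A minimal sketch of that behavior in plain Python (not SAP code; the field names and data are invented for illustration):

```python
# Conceptual sketch of "Handle Duplicate Record Keys" (not SAP code):
# for each semantic key, only the last record in the package is kept.

def handle_duplicate_keys(records, key_fields):
    """Keep only the last record per semantic key, preserving first-seen key order."""
    deduped = {}
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        deduped[key] = rec  # a later record overwrites an earlier duplicate
    return list(deduped.values())

# Hypothetical master-data package with a duplicate key "C1"
package = [
    {"CUSTOMER": "C1", "CITY": "Berlin"},
    {"CUSTOMER": "C2", "CITY": "Paris"},
    {"CUSTOMER": "C1", "CITY": "Munich"},  # duplicate key: this one wins
]
loaded = handle_duplicate_keys(package, ["CUSTOMER"])
```

Without the option enabled, a duplicate key in a master-data load would instead terminate the request with an error.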

Error Handling: While loading data into the target with a DTP, records with errors are moved to an error stack, and the correct records are loaded into the target.
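The error-stack behavior amounts to partitioning each package into valid and invalid records. A minimal sketch in plain Python (not SAP code; the validation rule and field names are invented):

```python
# Conceptual sketch of DTP error handling (not SAP code): each record is
# validated; invalid records go to an "error stack", valid ones to the target.

def load_with_error_stack(records, validate):
    target, error_stack = [], []
    for rec in records:
        (target if validate(rec) else error_stack).append(rec)
    return target, error_stack

# Hypothetical package: negative amounts count as erroneous records
records = [{"AMOUNT": 100}, {"AMOUNT": -5}, {"AMOUNT": 30}]
target, error_stack = load_with_error_stack(records, lambda r: r["AMOUNT"] >= 0)
```

The records in the error stack can later be corrected and posted with an error DTP, without reloading the whole request.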

Sometimes, based on the properties of the fields, certain records also have to be loaded into the target InfoObject.

This can be easily achieved with a DTP. There are three processing modes for background processing of standard DTPs. In parallel processing, a separate process is derived from the main process for each data package.

The main process extracts the data packages sequentially and, for each package, derives a process that processes that package's data. This setting can be changed on the Execute tab of the DTP, where the maximum number of parallel background processes for the DTP can also be set. While loading data with a DTP, error records move to the error stack and correct records are loaded into the target.
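The pattern described here, sequential extraction of data packages by a main process with parallel processing of each package, can be sketched in plain Python (not SAP code; the package size, worker count, and "transformation" are invented stand-ins):

```python
# Conceptual sketch of DTP background processing (not SAP code): the main
# process extracts data packages one after another and hands each package
# to a parallel worker for transformation and loading.
from concurrent.futures import ThreadPoolExecutor

def extract_packages(source, package_size):
    """Sequential extraction: yield fixed-size data packages in order."""
    for i in range(0, len(source), package_size):
        yield source[i:i + package_size]

def process_package(package):
    """Stand-in for the transformation applied by each derived process."""
    return [rec * 10 for rec in package]

source = list(range(9))  # hypothetical source records
with ThreadPoolExecutor(max_workers=3) as pool:  # "parallel processes" setting
    results = list(pool.map(process_package, extract_packages(source, 3)))
loaded = [rec for package in results for rec in package]
```

`pool.map` preserves package order, so the loaded result is deterministic even though the packages are processed in parallel.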

A DTP can also simulate a transformation prior to the actual data transfer, to check whether it produces the desired results. Breakpoints for debugging can be defined by choosing Change Breakpoints, which was not possible with an InfoPackage.

The ability to sort out incorrect records in an error stack and to write the data to a buffer after the processing steps of the DTP simplifies error handling.

Transformations simplify the maintenance of rules for cleansing and consolidating data. Instead of two sets of rules (transfer rules and update rules), as in the past, only transformation rules are needed. You edit transformation rules in an intuitive graphical user interface.

InfoSources are no longer mandatory; they are optional and only required for certain functions. Transformations also provide additional functions, such as quantity conversion and performance-optimized reading of master data and DataStore objects, as well as the option to create an end routine or expert routine. You model data flows and their elements in the Modeling functional area of the Data Warehousing Workbench.
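A transformation can be thought of as a set of per-field rules plus an optional end routine applied to the whole result package. A minimal sketch in plain Python (not SAP code; the rules, field names, and end routine are invented for illustration):

```python
# Conceptual sketch of a transformation (not SAP code): field rules map
# source fields to target fields; an optional end routine then post-processes
# the whole result package.

def transform(package, field_rules, end_routine=None):
    result = [{tgt: rule(rec) for tgt, rule in field_rules.items()}
              for rec in package]
    return end_routine(result) if end_routine else result

rules = {
    "MATERIAL": lambda r: r["MATNR"].lstrip("0"),  # cleansing: strip leading zeros
    "QTY_KG": lambda r: r["QTY_G"] / 1000,         # quantity conversion: g -> kg
}

def end_routine(result):
    """Runs once after all field rules: drop records with no quantity."""
    return [r for r in result if r["QTY_KG"] > 0]

out = transform([{"MATNR": "000123", "QTY_G": 2500}], rules, end_routine)
```

The end routine sees the full package after all field rules have run, which is what distinguishes it from a per-field rule.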

The graphical user interface helps you create top-down models and use the best-practice data flow templates provided by SAP. With top-down modeling, you create a model blueprint on the BW system, which you can later use to create a persistent data flow. For more information, see Modeling, especially the Graphical Modeling section, and the documentation on migrating existing 3.x data flows. The data flow in the data warehouse comprises the following elements: Data Transfer Process: The data transfer process (DTP) makes the transfer processes in the data warehousing layers more transparent.

Transformation: Transformations simplify the maintenance of rules for cleansing and consolidating data.


