SCN : Discussion List - SAP Business Warehouse

Data source extraction


Hello Experts,

 

I am extracting data from the datasources below to update master data InfoObjects.

I need to decide on the update mode at the InfoPackage and DTP level.

 

Data Source        Delta Type
---------------    ---------------------
0MAT_PLANT_ATTR    NEWE
0APO_LPROD_ATTR    AIE
0APO_LOCNO_TEXT    No delta mode enabled

 

Could you please explain these delta types? What should my update mode be at the InfoPackage and DTP level?

 

Thanks


Open Hub table data not getting deleted


Data is not getting deleted from the open hub table, even though the "Deleting Data from Table" option is ticked in the open hub destination.

There are two write modes available. Does anyone know how to select one of them? They are grayed out.

We tried moving the open hub from development to quality: data is deleted from the open hub table in development, but not in quality.

How to do a calculation based on a navigational attribute


Hi All

 

I have 0MATERIAL_MATRL_GROUP as a navigational attribute in the cube 0COPC_C04. I want to do a MAP calculation based on material group, but when I tried a field routine it did not allow me to use 0MATERIAL_MATRL_GROUP.

 

It would be great if you could help me with this.

 

Regards,

Jagriti Jha

Deletion of an InfoObject from a DSO with data in the target


Hello All,

 

I have a specific scenario: I recently added a column (an InfoObject in the data fields) to a DSO and moved the change to the quality system; the change has not been moved to production, though.

But I no longer need that column, and it needs to be removed in the target systems as well. Since the objects were not moved to the production system, I did not select the remodeling option; instead, I deleted the data in the DSO in the development system, removed the object, and activated all the corresponding objects (transformations, DTPs, etc.).

 

Issue: In the quality system, after my transports, a few data loads were run, activated, and then updated to higher-level targets (cubes, etc.). Hence I deleted the requests loaded after my previous change in the quality system (deleting the data loaded to the higher levels of the flow as well).

 

Question:

1. Will deleting the requests loaded after my previous change be sufficient for transporting the deletion TR from development to quality, or must the entire content of the DSO be deleted?

2. Usually, in the change mode of a DSO, the color of an InfoObject tells us whether it can be deleted (blue: cannot be deleted; black: can be deleted). Since I don't have authorization to view the DSO in change mode in the quality system, how do I know whether the deletion TR can be transported without my transports failing?

 

Regards,

Thejas K

DSO to Cube Data Load performance


Hi,

 

I am struggling to identify an issue with data load performance from DSO to cube; your input will be greatly appreciated.

 

Data flow: DataSource --> DSO1 --> DSO2 --> Cube

 

The transformation between DSO2 and the cube is a straight one-to-one mapping without any routines. Due to the nature of the data, it is a daily full load.

The load takes around 10 minutes as part of the overnight run. The same cube needs to be refreshed at lunchtime, and the very same load then takes around 2 hours, even though we have more than sufficient background processes available in SM50.

 

Thanks in advance

Object is locked by user ALEREMOTE


Hello Experts,

 

I have an issue with a DTP load from a DataSource to a DSO. It looks like a lock, but I have already checked SM12 and there was no entry locking my process chain. This started to happen after migrating to SAP BW 7.3. Please let me know how to resolve this issue; the error message is shown below. The DTP processing mode is "Serial extraction, immediate parallel processing".

 

 

Cannot lock data package 5.935.634/000002

Object  is locked by user ALEREMOTE

Exception CX_RS_FOREIGN_LOCK logged

Exception CX_RS_FOREIGN_LOCK logged

Why is the write-optimized DSO designed to add records instead of overwriting?


Hi BW Gurus,

 

I am aware of the properties of the write-optimized DSO and the direct-update DSO, and of their usage in BW.

My question is: why has SAP designed the WO DSO to only add records (additive) instead of overwriting? There must be a specific reason for this property. Kindly enlighten me. Thanks.

 

Regards,

Deepak Anand

How to convert a Write Optimized DSO to standard DSO


Hello All,

 

I have a requirement to convert a write-optimized DSO to a standard DSO. Is there any standard SAP-delivered program that can help me achieve this?

Do I have to delete all the data from the write-optimized DSO before converting it to a standard DSO?


Update to DataStore Object Data Records is failing (RSODSO_UPDATE, Class 19)


I am trying to load data from a standard DSO (source DSO) into another standard DSO (target DSO) while looking up data from a write-optimized DSO (lookup DSO). From the source DSO I get one record per employee; I look up 60 records per employee and append them to the result package in the end routine, so that I end up with 60 records per employee in the target DSO.

 

The code below works fine in BI 7.0, but I am unable to use the same code to load in BW 7.4 SP4. Is there a new setting I need to make at the DSO or DTP level so that I can load the data into the target DSO? Please advise. I have even generated a sequence per row in the end routine and tried loading, but I get the same key issue.

Thanks

 

The appending part of the end routine code:

 

        APPEND <result_fields> TO result_package_t.
      ENDLOOP.
    ENDIF.
  ENDLOOP.

  REFRESH RESULT_PACKAGE.
  RESULT_PACKAGE[] = result_package_t[].

 

 

I have even tried appending one record at a time instead of all 60 records into the result package, but I still get the same key issue:

  LOOP AT result_package_t ASSIGNING <result_fields>.
    APPEND <result_fields> TO RESULT_PACKAGE.
  ENDLOOP.

 

Error Message while executing the DTP:

Messages for 1 data records saved; request is red acc. to configuration

 

Message Text:  Duplicate data record detected (DS XXXXX, data package: 000001, data record: 1)
Message Class: RSODSO_UPDATE
Number:        19

 

Long Text

Diagnosis

    During loading, there was a key violation. You tried to save more than
    one data record with the same semantic key.

    The problematic (newly loaded) data record has the following properties:

    o   DataStore object: XXXXXX
    o   Request: 467358
    o   Data package: 000001
    o   Data record number: 1
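
Not a definitive answer, but two hedged suggestions. First, if 60 records per employee are meant to coexist in the target DSO, the differentiating sequence number must be part of the target DSO's semantic key; otherwise activation will always report a key violation. Second, after rebuilding RESULT_PACKAGE in the end routine, each line should carry a unique technical record number. A minimal sketch of the renumbering, reusing the <result_fields> field symbol already declared in the routine:

    " Sketch only: give every line of the rebuilt RESULT_PACKAGE a fresh,
    " unique technical record number.
    LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
      <result_fields>-record = sy-tabix.   " unique, ascending numbers
    ENDLOOP.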

Need help with the interrupt process in process chains


I have three independent process chains: A, B, and C. Chains A and B run independently. I need to trigger chain C after the successful completion of chains A and B, and chain C needs to be triggered via the event variants included in chains A and B.

I came to know that the interrupt process can be used in this scenario, but I am not sure how to use it. Is there any other way to trigger process chain C after the successful completion of chains A and B?

Suggestions are very much appreciated. Thanks in advance.
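
For reference, a sketch of one common pattern (an option under assumptions, not the only one): let chains A and B each end with an ABAP program step that raises a background event, and give chain C two interrupt processes, one per event, so that it continues only once both events have been raised. The event names below are hypothetical and must first be defined as batch events in transaction SM62.

    REPORT z_raise_chain_event.
    " Sketch: raise a background event as the last step of chain A; a
    " second copy raising Z_CHAIN_B_DONE would end chain B.
    CALL FUNCTION 'BP_EVENT_RAISE'
      EXPORTING
        eventid                = 'Z_CHAIN_A_DONE'
      EXCEPTIONS
        bad_eventid            = 1
        eventid_does_not_exist = 2
        eventid_missing        = 3
        raise_failed           = 4
        OTHERS                 = 5.
    IF sy-subrc <> 0.
      MESSAGE 'Event could not be raised' TYPE 'E'.
    ENDIF.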

Load data into a real-time InfoCube in BW 7.4


Dear All,

 

Good Day...!

 

We have a requirement where the user wishes to enter values into the BW system manually (the source system is a flat file), and the values should be saved to an InfoCube automatically.

 

Kindly let me know the steps to achieve this requirement in BW 7.4.

 

Thanks in Advance,

Venkat

DTP failure - Could not create partition: add_partition_failed


Hi Experts,

 

We have an issue where the DTP that loads data from the PSA to the FI-GL cube 0FIGL_C10 is failing with the error:

Could not create partition: add_partition_failed
( ( KEY_0FIGL_C10P >= 0000001191 ) ( KEY_0FIGL_C10P < 0000001192 ) )


My assumptions about the error are as follows:

1. I believe this issue is arising because the 0FIGL_C10 cube has not been compressed since 13.03.2015.

2. The data from 0FIGL_C10 is loaded onward into a staging-layer cube for consolidation; the last load to the staging cube was done on 13.03.2015.

3. Due to this data mart setting, the requests in the FI-GL cube cannot be compressed unless they have been loaded to the staging-layer cube, and hence this error occurs.

 

I have gone through a few threads related to this issue but could not find the exact resolution. Could you please clarify? I appreciate any help I can get.

 

Thanks,

AM

SAP BW Source system Creation Error


Hi Experts,

 

I am having a problem creating a source system in SAP BW: it gets created in the BW folder. Please find the image attached.

 

 

SOURCE_SYSTEM.JPG

 

Please , need help

 

Regards

 

Sunil Roy

Field-level routine in a transformation to achieve % brackets



Hi all,

 

I do not know ABAP and need to enhance a field-level routine so that the value derived by the code is placed into brackets (ranges).

 

Here's the existing code:

 

DATA: w_disc TYPE p DECIMALS 0,
      w_diff TYPE p DECIMALS 2.

w_diff = SOURCE_FIELDS - w_netsales.
IF ( NOT w_netsales IS INITIAL ) AND
   ( NOT SOURCE_FIELDS IS INITIAL ).
  w_disc = ( w_diff / SOURCE_FIELDS ) * 100.
  RESULT = w_disc.
ELSEIF ( w_netsales IS INITIAL ) AND
       ( NOT SOURCE_FIELDS IS INITIAL ).
  RESULT = '100'.
ELSE.
  RESULT = '00'.
ENDIF.

 

After this code runs I get the discount values shown in the attached image, but I need to organize the derived values into brackets like:

 

<0%
0-5%
5-10%
10-15%
...
95-100%

 

I do not want to see the individual values as shown in the attached image.

 

How do I achieve this?
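
A sketch of one way to do it, building on the routine above: derive the 5%-wide band from w_disc and return the band label as the result. This assumes RESULT is mapped to a CHAR characteristic long enough to hold labels such as '95-100%'.

    DATA: w_band    TYPE i,
          w_low     TYPE i,
          w_high    TYPE i,
          c_low(3)  TYPE c,
          c_high(3) TYPE c.

    IF w_disc < 0.
      RESULT = '<0%'.
    ELSE.
      w_band = w_disc DIV 5.          " 5%-wide band index: 0, 1, 2, ...
      IF w_band > 19.
        w_band = 19.                  " cap everything at the 95-100% band
      ENDIF.
      w_low  = w_band * 5.
      w_high = w_low + 5.
      WRITE: w_low  TO c_low  LEFT-JUSTIFIED,
             w_high TO c_high LEFT-JUSTIFIED.
      CONCATENATE c_low '-' c_high '%' INTO RESULT.   " e.g. '5-10%'
    ENDIF.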

 

regards,

 

BIapp

RSTSODS - Two versions of PSA


There are two versions of the PSA for DataSource 0FI_GL_4 in our production system. If I try to load data from this DataSource, I get the error message:

Request is in obsolete version of DataSource

 

How do I fix this issue? The DataSource was changed on June 2nd, so I deleted all requests in the PSA up to June 1st, but I still get the same error.

 

Do I need to transport the DataSource again, or can I run RSDS_DATASOURCE_ACTIVATE_ALL directly in the production system?

 

If I activate the DataSource, will RSTSODS be updated so that only one version remains?

 

Any help will be appreciated, since this is required quite urgently.


Converting FLTP data to normal number format


I would like to convert a standard key figure (9ADDKF1), which is set up in the DataSource as FLTP, to a normal number format with decimals. Is there a standard conversion available? If not, what options can I use?
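
For what it's worth, in an ABAP routine the conversion itself is just an assignment from the floating-point field to a packed field, which rounds to the target's decimals. A minimal sketch with illustrative variable names:

    " Sketch: convert an FLTP value (type f) to a packed number with
    " two decimals; the assignment converts and rounds automatically.
    DATA: w_fltp TYPE f,
          w_dec  TYPE p DECIMALS 2.

    w_fltp = '1234.56789'.
    w_dec  = w_fltp.                  " w_dec now holds 1234.57

If only the display is the concern, the number of decimal places can also be set in the key figure properties of the query.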

How to upload data into the BW system


HI,

 

I have to upload data from the BODS file server to the BW system (BIG, HANA) using the BODS application. I have populated the data in the BODS staging table; now I need to load the same data into the BW system. Please help me with this scenario.

COPA extraction: when do we load data from a flat file?


Hello Everyone,

In CO-PA extraction, when do we load data from the flat file?

 

Thank you very much.

 

Regards,

Kumar

Clear PSA Request


Hello Experts,

 

If the data target is an InfoCube:

Is it good practice to delete the earlier PSA request and then trigger the InfoPackage to load a new request into the PSA for every run, and to delete the InfoCube contents before every run before triggering the DTP load, irrespective of DTP delta/full update, just to avoid data duplication in the cubes?

 

If the data target is a master data InfoObject:

There is no possibility of duplicate records in an InfoObject, so deleting the earlier PSA request is not necessary.

 

Please clarify this understanding.

  

Thanks.

Code to correct invalid dates to the correct date format


Hi Experts,

 

I'm loading data from a DataSource to a DSO, and then from the DSO to an InfoCube.

 

While loading the data from the DataSource to the DSO, I got an error during DSO activation: "Activation of M records from DSO terminated".

After changing the DSO setting to "No SID Generation", I was able to load the data into the DSO.

 

So here is the issue I'm facing:

While loading from the DSO to the InfoCube, I got the error "Exception in Substep Write to Fact Table". I then found that this error was due to wrong date entries in the date column /OBIZ/LAEDA.

Some date values are empty; some are in the correct format, DD.MM.YYYY.

But some are in a wrong format, for example: 20.2..15.1

 

Could anyone please help me write the ABAP code for the start routine of the transformation between the DSO and the InfoCube?

Requesting ABAP and BW experts to help me successfully load the data into the InfoCube.
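
Not knowing the exact data model, here is a minimal start-routine sketch for the DSO-to-InfoCube transformation. It assumes the source structure carries the date in field /OBIZ/LAEDA (the name taken from the post) as a DATS value, and it simply clears any value that fails SAP's date plausibility check so that the fact table write no longer terminates:

    FIELD-SYMBOLS: <source_fields> TYPE _ty_s_sc_1.  " generated source type

    LOOP AT SOURCE_PACKAGE ASSIGNING <source_fields>.
      CALL FUNCTION 'DATE_CHECK_PLAUSIBILITY'
        EXPORTING
          date                      = <source_fields>-/obiz/laeda
        EXCEPTIONS
          plausibility_check_failed = 1
          OTHERS                    = 2.
      IF sy-subrc <> 0.
        " Invalid or garbled date (e.g. 20.2..15.1): clear it rather
        " than let the load fail.
        CLEAR <source_fields>-/obiz/laeda.
      ENDIF.
    ENDLOOP.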

 

Thanks & Regards,

Shruti Yendigeri
