Channel: ABAP Development

My CDS view self study tutorial - Part 1 how to test odata service generated by CDS view


I am a newbie to CDS views and have recently had to learn them. I will write down here not only the knowledge I have learned but also the way I obtained it through self study (debugging and other ABAP tools). It would be quite easy to just paste the source code of a sample CDS view from someone else's blog and activate it. The CDS view works, but what have you learned from this simple Ctrl+C and Ctrl+V? I always prefer to dig a little deeper, asking questions such as "what has happened in the backend after I click the activate button in ABAP Development Tools?".

 

In this part, I will introduce how to test the OData service generated from my CDS view using the Chrome extension Postman.

 

Prerequisite

 

I have created two simple CDS views:

 

@AbapCatalog.sqlViewName: 'z20160310'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'consume view test'
@ObjectModel: {
  type: #CONSUMPTION,
  compositionRoot,
  semanticKey: ['Actor'],
  createEnabled,
  deleteEnabled,
  updateEnabled
}
define view Zjerrytest20160310 as select from Zjerrytest20160309 {
  key Zjerrytest20160309.carrid as Jerryid,
  key Zjerrytest20160309.carrname as name,
  key Zjerrytest20160309.cityfrom as startLocation,
  key Zjerrytest20160309.cityto as target,
  key Zjerrytest20160309.connid
}

and

@AbapCatalog.sqlViewName: 'zjerrySQL0309'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'test 233'
@ObjectModel: {
  createEnabled,
  deleteEnabled,
  updateEnabled
}
define view Zjerrytest20160309
as select from spfli association [0..1] to scarr as _scarr
on _scarr.carrid = spfli.carrid {
  key spfli.carrid,
  key _scarr.carrname,
  key spfli.connid,
  spfli.cityfrom,
  spfli.cityto
}

Then create a project in tcode SEGW and import the first CDS view via the context menu:

clipboard1.png

After that the project looks like below:

clipboard2.png

Generate the runtime artifacts and register the OData service via tcode /IWFND/MAINT_SERVICE. Now the OData service is ready for testing.

 

Metadata test

 

Of course you can use the SAP Gateway Client to test, but I prefer the Chrome extension Postman, which can organize all my test cases in a hierarchical structure like the one below.

clipboard3.png

There are several ways to get the URL for testing the metadata retrieval operation.

 

In tcode SEGW, you can get your service name.

clipboard4.png


Search for it in tcode /IWFND/MAINT_SERVICE and click "SAP Gateway Client".

clipboard5.png

Change the URL to "/sap/opu/odata/sap/<your service name>/?$metadata" and trigger the request. You should see return code 200 with status OK.

clipboard6.png

Alternatively, you can use Postman. In this case you have to paste the absolute URL including host name and port number, both of which can be found in the Gateway Client response, as marked by the black rectangle above.

clipboard7.png

Read operation test

 

The URL I am using is:

https://<host name>:<port number>/sap/opu/odata/sap/ZJERRY20160310TRY_SRV/Zjerrytest20160310

 

The name "Zjerrytest20160310" is the entity set name, which you can find in SEGW.

clipboard8.png

The read operation works; see part of the response data below. But how does the read operation work under the hood?

clipboard9.png

We know that the response is fetched from the database view that was automatically generated when the CDS view was activated, and we would like to know which exact line of ABAP code does this job. As introduced in my blog Six kinds of debugging tips to find the source code where the message is raised, we can get the answer via tcode ST05.

 

Switch on the SQL trace in your system via tcode ST05, and then perform the read operation again. Once finished, display the trace result with the filter Object Name = "*03*0*" (at this point I am not sure whether the data comes from Z20160309 or Z20160310).

clipboard10.png

Only one result is found. Click the button to display the ABAP code.

clipboard11.png

Then we find what we are looking for: line 22 performs the read operation.

clipboard12.png

Now we can study the call stack in the debugger to understand how the request sent from the UI is parsed and handled.

clipboard13.png

Variable lv_sql_statement in line 629 contains the automatically generated SQL statement:

clipboard14.png

SELECT "Zjerrytest20160310"."JERRYID" AS "JERRYID", "Zjerrytest20160310"."NAME" AS "NAME", "Zjerrytest20160310"."STARTLOCATION" AS "STARTLOCATION", "Zjerrytest20160310"."TARGET" AS "TARGET", "Zjerrytest20160310"."CONNID" AS "CONNID" FROM "Z20160310" AS "Zjerrytest20160310" WHERE "Zjerrytest20160310"."MANDT" = '001' WITH PARAMETERS( 'LOCALE' = 'CASE_INSENSITIVE' )
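For comparison, the same data can also be read directly with Open SQL against the CDS entity; the kernel translates this into a select on the generated SQL view, much like the statement traced above. A minimal sketch (the row limit is my own addition, not part of the generated code):

```abap
" Read from the CDS entity; at runtime this hits the generated
" SQL view Z20160310, similar to the traced statement above.
SELECT * FROM zjerrytest20160310
  INTO TABLE @DATA(lt_flights)
  UP TO 10 ROWS.
```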

 

The response data in ABAP format can be found in the variable et_flat_data in this call stack frame:

clipboard15.png

clipboard16.png

Filter operation test


The URL I am using is:


https://<host name>:<port number>/sap/opu/odata/sap/ZJERRY20160310TRY_SRV/Zjerrytest20160310?$filter=(Jerryid%20eq'LH')

 

This means only the records that fulfill the condition "Jerryid = 'LH'" are returned.
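The %20 sequences in the URL above are just URL-encoded spaces. As a side note (this helper is my own illustration, not from the original post), the encoding of an OData $filter expression can be reproduced with a few lines of Python:

```python
from urllib.parse import quote

# The OData $filter expression as plain text
filter_expr = "Jerryid eq 'LH'"

# Percent-encode it the way it appears in the request URL
encoded = quote(filter_expr)
print(encoded)  # Jerryid%20eq%20%27LH%27
```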

clipboard17.png

This time, the automatically generated SQL statement is slightly different from the one for the read operation.

Here the "?" acts as a placeholder for a parameter, whose value is supplied by another variable in line 29.

clipboard18.png

clipboard19.png

clipboard20.png

Once line 22 is executed, the filter operation works as expected.

clipboard21.png


My CDS view self study tutorial - Part 2 what objects are automatically generated after you activate one CDS view


 

 

You paste the following source code for a simple CDS view into ABAP Development Tools and activate it:

 

@AbapCatalog.sqlViewName: 'zjerrySQL0208'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'test 233'
@ObjectModel: {
  createEnabled,
  deleteEnabled,
  updateEnabled
}
define view Zjerrytest20160208
as select from spfli association [0..1] to scarr as _scarr
on _scarr.carrid = spfli.carrid {
  key spfli.carrid,
  key _scarr.carrname,
  key spfli.connid,
  spfli.cityfrom,
  spfli.cityto
}

And you would like to know what objects are automatically generated during CDS view activation.

 

Automatically generated ABAP objects during CDS view activation

 

You can query table TADIR with the following parameters:

clipboard1.png
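As a sketch of such a query in ABAP (the selection pattern is an example; PGMID, OBJECT and OBJ_NAME are the actual key fields of TADIR):

```abap
" List all repository objects whose name starts with our CDS view name
SELECT pgmid, object, obj_name
  FROM tadir
  WHERE obj_name LIKE 'ZJERRYTEST20160208%'
  INTO TABLE @DATA(lt_objects).
```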

And get answer:

 

DDLS: Data Definition Language Source

 

STOB: Structured Object

clipboard2.png

The relationship among these objects is listed below:

clipboard3.png

If you use the same approach described in part 1 of this tutorial, you will notice that many database tables with the "DD*" prefix are involved during CDS view activation, for example DDLDEPENDENCY. A small tip: if you click the "Display Object List" button, you will navigate to the package, where the other related ABAP artifacts in the same package are displayed as well.

clipboard4.png

clipboard5.png

Now we can go through each database table one by one.

 

Automatically inserted table entries during CDS view activation

 

Several table entries are inserted into the database tables in package SDDL during view activation.

 

DDDDLSRC

 

Query this table with DDLNAME set to the CDS view name we specified in ABAP Development Tools (the name after the keyword "define view": Zjerrytest20160208), and we can find the view source code stored in the field SOURCE.

clipboard6.png
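The same lookup can be done in ABAP; a minimal sketch based on the fields mentioned above (DDLNAME and SOURCE):

```abap
" Fetch the stored DDL source of the CDS view from DDDDLSRC
SELECT SINGLE source
  FROM ddddlsrc
  WHERE ddlname = 'ZJERRYTEST20160208'
  INTO @DATA(lv_ddl_source).
```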

DDDDLSRC02BT

 

This text table stores the view description specified via the annotation @EndUserText.label.

clipboard7.png

DDHEADANNO

 

This table stores all header annotations specified in the CDS view source code together with their values.

clipboard8.png

DDLDEPENDENCY

 

This table maintains the relationship between the CDS entity and the automatically generated database view.

clipboard9.png

My CDS view self study tutorial - Part 3 how is view source in Eclipse converted to ABAP view in the backend



Let's review what we have learned so far. Through SQL trace and debugging, we have learned the following:

 

1. When we click the activate button in ABAP Development Tools (which I will call Eclipse in this blog), several corresponding ABAP objects are generated and table entries are inserted. Through debugging we know that much of the related logic is implemented in package SDDL.

 

2. When we test the read and filter functions of the OData service created on top of the CDS view, the operation is delegated to CL_SQL_STATEMENT~EXECUTE_QUERY.

 

Now I ask myself: since what we typed in Eclipse is pure text, how does the ABAP backend interpret this text and convert it into ABAP DDIC objects? The only hint we have is the SDDL package. There are more than 10 classes in it. An experienced ABAP developer can easily identify which class is responsible for the text-to-view conversion just from the class name and description, whereas ABAP newbies need more time to develop that intuition.

 

clipboard1.png

The tip here is: if you are not sure which class you are looking for, always start with what you already know. In our case that is CL_SQL_STATEMENT~EXECUTE_QUERY. Set a breakpoint on this method and type a few more characters in Eclipse; the breakpoint is triggered immediately.

 

From the call stack, we learn that every time you type something in Eclipse, it fires a syntax check request to the ABAP backend.

clipboard2.png


You can observe this in the bottom part of your Eclipse window.

clipboard3.png

In the call stack we find the class CL_DD_DDL_HANDLER, which implements the interface IF_DD_DDL_HANDLER and seems to be what we are looking for. Open this class and you will find a method GET_VIEWDEF_FROM_SRC, whose name indicates that it converts the source code (SRC) to an ABAP view definition (VIEWDEF). Set a breakpoint in it and click the activate button again in Eclipse.

clipboard4.png

clipboard5.png

This class will parse source code:

clipboard6.png

clipboard7.png

clipboard8.png

From the code, it is very clear that the parse output, the view definition in ABAP format, is used to generate the ABAP database view.

clipboard9.png

There is also another approach which gives you a clearer view:

 

Open the "ABAP Communication Log" view:

clipboard10.png


Switch on Logging:

clipboard11.png

Then make a dummy change to your view, and you can observe the following six requests sent to the ABAP backend.

The highlighted "abapCheckRun" is just the syntax check we already learned about.

clipboard12.png

Double-click on the request to see its details. Here the view source code is stored Base64-encoded in chkrun:content.

clipboard13.png


We can of course recover the original text with any online Base64 decoder:

clipboard14.png
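Instead of a website, a couple of lines of Python also do the decoding. Here is a round trip with a sample payload standing in for the real chkrun:content (the source string below is my own example, not the actual request content):

```python
import base64

# Sample CDS source text standing in for the chkrun:content payload
source = "define view Zjerrytest20160208 as select from spfli { key spfli.carrid }"

# Encode the way the payload is transmitted, then decode it back
encoded = base64.b64encode(source.encode("utf-8")).decode("ascii")
decoded = base64.b64decode(encoded).decode("utf-8")

print(decoded == source)  # True
```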

If you would like to debug any request you see in the ABAP communication log view, please refer to this document: An example to help you understand how does ADT work.

clipboard15.png

When you click the activate button in Eclipse, you can see that a syntax check is always performed before the actual activation.

clipboard16.png


Unicode check with Code Inspector


Some companies still run non-Unicode SAP systems. If you plan to convert your system to Unicode, you can run UCCHECK regularly on your system, or add the Unicode check to the Code Inspector framework and perform the check, e.g. during the release of transports, or create a list with links to the source code using Code Inspector / ATC.

 

The attached document shows how to create your own CI test class that performs the Unicode check.

 

Create checker classes

Base syntax check class

Copy class CL_CI_SOURCE_SYNTAX_CHECK to your own namespace:

1.png

In the constructor method, set the Unicode flag for all report sources before executing the check.

Either fixed:

2.png

Or, optionally, via a parameter:

3.png

4.png

 

Code inspector class

Copy class CI_TEST_SYNTAX_CHECK to your namespace.

Adapt copied CI class:

  1. Replace the class names in your new class.
    1. In the local friends definition of the unit test with your newly created class.

      5.png
    2. In the class constructor of the test class

      6.png
    3. In attribute C_MY_NAME:
      7.png
    4. In method IF_CI_TEST~QUERY_ATTRIBUTES

      8.png
    5. In attribute REF_CHECK

      9.png
    6. Optional: modifications for the Unicode check when using a parameter.
      1. When using the Unicode parameter in the constructor of the base syntax checker class, you can pass the parameter in the GET method:
        Fixed

        10.png
      2. Or create a parameter in your CI class for the UCCHECK
        1. Create attribute

          11.png
        2. Adapt IF_CI_TEST~QUERY_ATTRIBUTES to provide parameter

          12.png

 

 

 

  1. Adapt data transfer methods
    1. PUT_ATTRIBUTES
      13.png
    2. GET_ATTRIBUTES
      14.png
  2. Change the description of the test in the constructor so you can find it afterwards in the CI tree.
    (The length of the text symbol has to be adapted.)

15.png

Prepare code inspector

Activate your test (class)

16.png

17.png

Create public check variant

18.png

Activate your check and set attribute for code inspector

19.png

 

Example: Use the check in ATC

The created check variant can be used in various scenarios e.g. during transport release.

 

Prepare ATC

Set check variant & activate inform on errors

  20.png

With these settings, the Unicode check is performed on release.

21.png

22.png

My CDS view self study tutorial - Part 4 how does annotation @OData.publish work


Part1 - how to test odata service generated by CDS view

Part2 - what objects are automatically generated after you activate one CDS view

Part3 - how is view source in Eclipse converted to ABAP view in the backend

Part4 - this blog

 

In part 1 of this tutorial, the old way to create an OData service on top of a CDS view was introduced.

In the SAP Help page Generate Service Artifacts From a CDS View, a new annotation is described:


@OData.publish: true

 

Just add this annotation to your CDS view and the OData service is automatically created; there is no need to go to tcode SEGW any more.

clipboard1.png

Once activated, an OData service following the naming convention "<your CDS view name>_CDS" is available for registration in tcode /IWFND/MAINT_SERVICE:

clipboard2.png

Metadata retrieval test ok:

clipboard3.png

So the question is: how does this annotation work? How can we research the service generation process through debugging?

 

Here is how I figured it out.

 

First, check what objects have been generated: three additional artifacts are highlighted below.

 

  • IWMO: SAP Gateway Business Suite Enablement - Model
  • IWSV: SAP Gateway Business Suite Enablement - Service
  • CLAS: provider class ZCL_ZJERRYTEST20160311 is generated

clipboard4.png

If I remove the annotation, or change it to @OData.publish: false, these artifacts are gone:

clipboard5.png

So the annotation @OData.publish: true triggers table entry insertions for the types IWMO, IWSV and CLAS during view activation. I then switched on the ST05 trace again and easily found the ABAP code where the table insertion is done.

clipboard6.png

Set a breakpoint on that code and observe the call stack at runtime.

clipboard7.png

The highlighted call stack frames are essential for the OData service generation.

clipboard8.png

Let's have a deeper look at stack frame 21:

CL_WB_DDLS_SECOBJ_HNDLR_SINGLE->IF_DDIC_WB_DDLS_SECOBJ_HANDLER~ON_ACTIVATION

 

It first identifies which objects must be created based on the delta state.

clipboard9.png

For example, if I add @OData.publish: true to an existing CDS view and activate it, the corresponding entry will have the flag N (New), while the other existing annotations have type "U" (Unchanged).

clipboard10.png

Inside this method, if the annotation ODATA.PUBLISH is found,

clipboard11.png

and its value is true, then the OData service is considered due for creation.

clipboard14.pngclipboard12.png

clipboard13.png

The OData service creation is implemented in CL_SADL_GTK_ODATA_SERVICE_GEN~CREATE_VIA_EXPOSURE, shown below.

clipboard14.png

Complete callstack:

clipboard15.png

Step by Step Procedure to Find BTE and Implementation


  Hi Friends,

 

  Here I am detailing my recent BTE implementation with screenshots.

 

  1. To find a BTE for a transaction, we can use the function modules BF_FUNCTIONS_FIND and PC_FUNCTION_FIND.
  2. Set a breakpoint on the line below.
  3. 1.png
  4. Go to the required transaction, provide appropriate inputs, and press Enter or F8 as the requirement dictates.
  5. Here the transaction is F-48, and the event triggers on Enter. 2.png
  6. Find the event number through debugging, as in the screen below. 3.png
  7. Go to transaction FIBF ->
  8. Environment -> Info System (P/S)
  9. Enter the transaction event on the screen and execute. 5.png 6.png
  10. Click "Sample Function Module"; it will open the SE37 screen.
  11. Copy the function module, create it as a Z function module, and activate it. 9.png
  12. Now the functional consultant has to assign the FM to the standard event 00001005.
  13. Go to the same FIBF -> Settings -> Products -> ...of a Customer 11.png
  14. Enter the product (ZBT3) and a text, check the "A" (active) flag, and save. It will prompt for a customizing transport request; create it.
  15. Again go to FIBF -> Settings -> P/S Modules -> ...of a Customer 12.png
  16. Click "New Entries", enter the event number (00001005), the product (ZBT3) and the function module ZSAMPLE_INTERFACE_00001005. Save in the same customizing TR.
  17. The BTE will trigger after this customizing TR has been moved along with the workbench request.

 

 

Regards.

 

Palaniyappan.S

My CDS view self study tutorial - Part 5 how to create CDS view which supports navigation in OData service



 

So far we have a working CDS view, on top of which we can create a UI5 application via Smart Template in SAP Web IDE within just a couple of minutes. Once done, the UI5 application will display the data from our CDS view as shown below. For a step-by-step guide on how to achieve this, please refer to this blog: Step by Step to create CDS view through SmartTemplate + WebIDE.


clipboard1.png

How navigation is implemented among CDS views

 

In this part, let's create a CDS view which supports node navigation in an OData service. The previous CDS views we created have a flat structure with only a root node. Now let's create a series of CDS views:

 

1. A CDS view which contains two fields, spfli.connid and spfli.carrid. This view acts as the root node of the corresponding OData service model from a semantic point of view, and it supports navigation from itself to the defined child node.

 

2. A CDS view which acts as the navigation target of the previously defined "root" view. Besides the two fields sflight.connid and sflight.carrid, which correspond to the root view, it has an additional field sflight.fldate.

 

OData navigation means: suppose I am currently in the context of spfli.connid = 0001 and spfli.carrid (the data record in yellow); through navigation I can get all its dependent data (in red). We will see how this navigation is performed later.

 

clipboard1.png

3. A CDS view which exposes the two fields connid and carrid from the root view and the associated data from the child view. This view is called the "consumption" view and is used to publish the OData service.

 

Source code of view #1:

 

@AbapCatalog.sqlViewName: 'zspfliroot'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'root view'
define view Zspfli_Root as select from spfli
association [0..*] to Zsflight_Child as _Item
  on  $projection.carrid = _Item.carrid
  and $projection.connid = _Item.connid
{
  key spfli.connid,
  key spfli.carrid,
  @ObjectModel.association.type: #TO_COMPOSITION_CHILD
  _Item
}

Source code of view #2:

 

@AbapCatalog.sqlViewName: 'zsflightchild'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'child_view'
define view Zsflight_Child as select from sflight
association [1..1] to zspfli_root as _root
  on  $projection.connid = _root.connid
  and $projection.carrid = _root.carrid
{
  key sflight.carrid,
  key sflight.connid,
  key sflight.fldate,
  @ObjectModel.association.type: [#TO_COMPOSITION_ROOT, #TO_COMPOSITION_PARENT]
  _root
}

Source code of view #3:

 

@AbapCatalog.sqlViewName: 'zflight_c'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'flight consumption view'
@OData.publish: true
@ObjectModel: {
  type: #CONSUMPTION,
  compositionRoot,
  createEnabled,
  deleteEnabled,
  updateEnabled
}
define view Zflight_Com as select from Zspfli_Root {
  key Zspfli_Root.carrid,
  key Zspfli_Root.connid,
  @ObjectModel.association.type: [#TO_COMPOSITION_CHILD]
  Zspfli_Root._Item
}

Activate all three CDS views. Since the third (consumption) view has the annotation @OData.publish: true, an OData service is automatically generated once it is activated:

clipboard3.png

How to test navigation

 

First, check the response of the OData metadata request via the URL /sap/opu/odata/sap/ZFLIGHT_COM_CDS/$metadata in the Gateway Client.

You should find two AssociationSets generated based on the corresponding annotations in the CDS views.

clipboard4.png

The entity set Zflight_Com has the type Zflight_ComType, which has the navigation property "to_Item". Now we can test the navigation.

clipboard5.png

First we get the root node's content via the URL /sap/opu/odata/sap/ZFLIGHT_COM_CDS/Zflight_Com(connid='0400',carrid='LH').

clipboard6.png

And in the response, we are told that the correct URL for navigating from the current node to its child node is built by appending the navigation property defined in the metadata, to_Item, to the end of the URL, that is, /sap/opu/odata/sap/ZFLIGHT_COM_CDS/Zflight_Com(connid='0400',carrid='LH')/to_Item.

clipboard7.png

How the navigation is implemented on the ABAP side

 

Set a breakpoint in the method below and re-trigger the navigation operation.

Check the generated SQL statement in the variable statement in line 27.

clipboard8.png

SELECT "Zsflight_Child"."CARRID" AS "CARRID", "Zsflight_Child"."CONNID" AS "CONNID", "Zsflight_Child"."FLDATE" AS "FLDATE" FROM "ZSFLIGHTCHILD" AS "Zsflight_Child"
WHERE "Zsflight_Child"."CARRID" = ? AND "Zsflight_Child"."CONNID" = ? AND "Zsflight_Child"."MANDT" = '001' WITH PARAMETERS( 'LOCALE' = 'CASE_INSENSITIVE' )

 

The values for the two placeholders (?) are stored in me->parameters->param_tab:

clipboard9.png
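This is standard ADBC parameter binding. A minimal sketch (not the generated code itself) of how such a parameterized query is issued with CL_SQL_STATEMENT, binding one value per "?" in order:

```abap
DATA lv_carrid TYPE s_carr_id VALUE 'LH'.
DATA lv_connid TYPE s_conn_id VALUE '0400'.

" Bind values for the two "?" placeholders in order, then execute
DATA(lo_stmt) = NEW cl_sql_statement( ).
lo_stmt->set_param( REF #( lv_carrid ) ).
lo_stmt->set_param( REF #( lv_connid ) ).
DATA(lo_result) = lo_stmt->execute_query(
  |SELECT * FROM zsflightchild WHERE carrid = ? AND connid = ?| ).
```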


And check the response in et_flat_data:

clipboard10.png

clipboard11.png

Open Source ABAP tools - abaplint and abapCov


abaplint

Continuous Integration is a hot topic, but it has always been difficult to do anything for ABAP open source projects, as it would require a dedicated SAP system to run the tests on.

 

So I've started writing abaplint, a linter for ABAP code written in TypeScript. abaplint compiles to JavaScript, which can be executed on various platforms. It only covers the "Code Inspector" part of checking the code; running unit tests is not in scope for this project.

 

abaplint works on objects that have been serialized using abapGit, or on files containing raw ABAP code.

 

Currently 17 different rules have been implemented; see https://github.com/larshp/abaplint/wiki. The checks can be enabled, disabled and configured via an abaplint.json file. The project is open source and a work in progress, so expect some problems, but pull requests and suggestions are welcome.

 

abaplint can be set up to run on Travis CI (see the example at https://travis-ci.org/larshp/abapGit): each time a push is made to the ABAP git repository, Travis downloads the latest abaplint via npm and runs the linter on the committed code.

 

It also works in the Atom editor,

atom.png

 

And in Eclipse (update site to be created),

eclipse.png

 

Plus on the web

web.png

 

abapCov

abapCov takes a step towards visualizing the coverage of the unit tests run on the ABAP server.

 

Run abapCov on the ABAP server: it runs the unit tests with coverage enabled, and the coverage result is then uploaded to https://codecov.io, where anyone can check it out.

 

See example at https://codecov.io/github/larshp/abapOpenChecks

 

 

 

https://github.com/larshp/abaplint or via npm

https://github.com/larshp/abapCov


How to do convenient multicase unit tests with zmockup_loader


Some time ago I published a post about a mockup loading tool we use for unit testing in our projects: http://scn.sap.com/community/abap/blog/2015/11/12/unit-testing-mockup-loader-for-abap. Since then my team and I have been using it intensively in our developments and have also introduced a couple of new features. This post describes our approach to unit test preparation.

 

Briefly, for those who didn't read the previous post: the idea of the mockup loader is that you prepare your test data in Excel, copy it to a set of text files (manually or with the excel2txt script that is also available in the repo), zip them, upload the zip as a binary object via SMW0, and use the mockup loader to extract and deploy this data dynamically for your unit test. For more details see the previous post (link above).

 

Use case

Let's suppose we have a method designed to assess the quality of some product. We input a table of quality inspection parameters and expect a quality class assessment, in this example "A" or "B". Certain parameters are compulsory and must not be skipped during quality inspection: if they are missing, the method raises an exception.

 

The goal: write the unit test code once and then easily extend it with new test cases prepared in Excel.

 

Data preparation

Let's say we prepared a unit test table with 3 cases (docno = 10, 20 and 30). Note that the QUANTITY parameter is missing in document 30, so an exception is expected.

 

DOCNO  PARAM     VALUE
10     WEIGHT    12,89
10     QUANTITY  99
10     MOISTURE  12,6
20     WEIGHT    14,89
20     QUANTITY  103
20     MOISTURE  11,9
30     WEIGHT    14,89
30     QUANTITY
30     MOISTURE  12,10

 

Another table is the index of test cases. Each entry refers to a DOCNO, describes the expected result, and describes the case in a human-readable form (as a kind of documentation and for failure messages). The TYPE field indicates whether the case is positive or negative.

 

TESTID  TYPE  DOCNO  EXPECT  MSG
1       +     10     A       Case with quality A
2       +     20     B       Case with quality B
3       -     30             Case with a missing param

 

 

Unit test code

 

1) Define the test case index structure (in the test class)

 

class lcl_test definition for testing inheriting from cl_aunit_assert
  duration short risk level harmless.
  public section.
    types:
      begin of ty_test_case,
        testid type i,
        type   type char2,
        docno  type char10,
        expect type char1,
        msg    type string,
      end of ty_test_case.

    data at_cases type table of ty_test_case.       " << index table
    data o_ml     type ref to zcl_mockup_loader.    " << mockup loader instance
    data o        type ref to zcl_class_under_test. " << class under test
    ...

 

2) In the setup() method, acquire a mockup loader instance and load the index table

 

  method setup.
    data lo_ex type ref to zcx_mockup_loader_error.

    create object o importing ...    " << object under test initialization

    try.
      o_ml = zcl_mockup_loader=>get_instance( ).

      o_ml->load_data(
        exporting i_obj       = 'TEST1/index'    " << file path inside zip file
        importing e_container = me->at_cases ).

    catch cx_static_check into lo_ex.
      fail( lo_ex->get_text( ) ).
    endtry.
  endmethod.

 

3) And here is the main test cycle. By the way, one of the new features of the mockup loader introduced recently is data filtering; see the call of o_ml->load_data(): it passes a field name and value. If specified, the mockup loader filters the output to meet this condition. It is also possible to define more complex filtering (more details on the GitHub homepage).

 

  method test_assess_quality.
    data l_case  type ty_test_case.
    data lo_ex   type ref to cx_root.
    data lt_insp type zcl_class_under_test=>ty_inspections.
    data l_act   type zcl_class_under_test=>ty_quality.

    loop at at_cases into l_case.                  " << loop through cases
      clear lo_ex.

      try.
        o_ml->load_data(
          exporting i_obj       = 'TEST1/inspections'
                    i_where     = 'DOCNO = ' && l_case-docno
          importing e_container = lt_insp ).       " ^^^ filter only relevant lines

        l_act = o->assess_quality( lt_insp ).      " << test call

      catch zcx_mockup_loader_error into lo_ex.
        fail( lo_ex->get_text( ) ). " mockup load error handling, just in case
      catch cx_root into lo_ex.
        " do nothing - this is the expected exception for negative tests
        " better use specific exception classes of course
      endtry.

      if l_case-type = '+'. " positive test
        " msg indicates the failed test id and its description
        assert_initial( act = lo_ex
                        msg = |[{ l_case-testid }] { l_case-msg }| ).
        assert_equals( act = l_act
                       exp = l_case-expect
                       msg = |[{ l_case-testid }] { l_case-msg }| ).
      else. " '-'           " negative test
        assert_not_initial( act = lo_ex
                            msg = |[{ l_case-testid }] { l_case-msg }| ).
        " potentially more specific error check code should follow
      endif.

    endloop.
  endmethod.

 

4) That's it !

 

Now we have an easily extendable test infrastructure. New test cases can be added without extra coding. Honestly speaking, this works better for integration tests than for checking small methods. However, it is very useful when applicable: some methods in our developments have unit tests with 20+ cases, and they are still updated from time to time with new ones (usually after yet another bug discovery).

 

Hope you find it useful !

 

The ABAP mockup loader project lives on GitHub, where you can find a full reference of the methods and also some use cases in the wiki.

 

All the best, Alexander Tsybulsky.

International Editable SALV Day 2016


International Editable SALV Day 2016 – Year Eight


image001.png

Dear CL_SALV_TABLE Fans,

 

Welcome to February 8th, 2016 which is the eighth International Editable SALV Day. See below for a link to a blog I wrote to celebrate this day exactly one year ago:-

 

http://scn.sap.com/community/abap/blog/2015/02/08/international-editable-salv-day-2015

 

As you may know, back in the year 2000 we were all encouraged to “enjoy” SAP, and the good old CL_GUI_ALV_GRID came along to replace the function module REUSE_ALV_GRID_DISPLAY.

 

This was nice and object oriented, which we all love. ABAP people love OO programming so much that sixteen years on, some have even started to use it!

 

Anyway, what we liked about the CL_GUI_ALV_GRID was the fact that you could make some of the columns, or even individual cells, editable. This is what the end users were crying out for. SAP itself made a big song and dance about having a convergence between analytical applications and transactional applications.

 

That is, a business user did not just stare at the data and admire it, but could actually take action on the data they were looking at e.g. remove a billing block, or adjust the price.

 

Thus began a golden age. All the peoples of the world forgot their differences and at long last there was world peace, an end to sickness and suffering, and an increase in the average human lifespan to 206 years. As Rufus said “bowling averages were way up, mini-golf scores were way down”.

 

Then, a great disaster befell the universe. With the advent of 7.20, a new monster was created by SAP, named CL_SALV_TABLE, and we were all told to bow down and worship this new beast.

 

It fooled us all with its sweet promises of dynamically generating the field catalogue based on an internal table, and we fell for it, writing many reports using this new class, and converting many an old report to use this bright new technology.

 

We laughed and sang and thought how wonderful we were. But as always, pride comes before a fall.

 

One day the business users came to us and said “what we most desire in the world is to have this or that column editable, and maybe a few cells here and there, based on certain logic”. They then stood there looking at us expectantly; after all, we had always been able to do it before. Alas and alack – now we could not! We had been cast out of Heaven!

 


image002.png

How could SAP play such a cruel trick on us? Pretending a new technology was superior in every way to the last, and not mentioning a whacking great hole in the middle. This was all the more annoying since CL_SALV_TABLE is nothing more than a wrapper around CL_GUI_ALV_GRID, adding some features, and clearly subtracting others.

 

For many years the sky turned black, and a plague of frogs rained down upon the SAP developer community. Then several programmers took a leaf out of Twisted Sister’s book and declared “we’re not going to take this, we’re not going to take this, and we’re not going to take this, any more”.

 

Here is the article that started the rebellion, written by Naimesh Patel

 

http://zevolving.com/2008/12/salv-table-10-editable-salv-model-overcome-the-restriction-of-salv-model/

 

Since that point many people have joined the crusade, including my good self. There have been many approaches created as to how to get around this, get the benefits of the SALV and also make it editable. Here is the latest blog I wrote on the subject, building on the work of the others who went before me:-

 

http://scn.sap.com/community/abap/blog/2015/08/07/salv-and-pepper-editing-individual-columns-in-the-salv

 

Of course the best way around this problem would be for the powers that be at SAP to recognise that every single one of their customers desires this functionality. I wonder how far SAP would have got if they had said “we have embraced the internet age, with a wonderful suite of new UI5 apps. All of them will be read-only”.

 

This day marks the 8th anniversary of James Hawthorne going cap in hand to SAP and suggesting maybe the SALV could be brought up to parity with the CL_GUI_ALV_GRID and have an option to be editable.

 

http://scn.sap.com/thread/733872

 

SAP did in fact respond to this request. They said it was impossible, and just laughed and laughed and laughed. They are still laughing to this day, fit to burst. No-one in all eternity has ever laughed so loud and so long, with the exception of Jock McLaughing, the Laughing Scotsman, winner of the All Scotland Laughing Competition 2015.

 

We have to face the cold hard truth. It is never going to change. We have proved it is technically possible, not even that difficult, but all the powers that be do is look at our workarounds and then try to close them down.

 

Currently, the way around this problem is to:-

(a)  Use CL_GUI_ALV_GRID

(b)  Use one of the many workarounds you can find on the SCN as to how to make the CL_SALV_TABLE editable, though this is of course naughty

(c)  Take a hybrid approach. Use the CL_SALV_TABLE to generate the field catalogue dynamically, and then pass that field catalogue into CL_GUI_ALV_GRID. There have been some articles on the SCN about that also.

 

Here is an example of such an approach by Łukasz Pęgiel :-

http://scn.sap.com/community/abap/blog/2016/01/10/falv--fast-alv-grid-object
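To make approach (c) concrete, here is a minimal hedged sketch (the table, the PRICE column and the container are illustrative assumptions, not a full implementation): let CL_SALV_TABLE generate the field catalogue via CL_SALV_CONTROLLER_METADATA, flag a column as editable, and hand the catalogue to the good old CL_GUI_ALV_GRID.

```abap
" Hybrid approach sketch: SALV builds the field catalogue, the classic grid displays it
DATA: gt_sflight TYPE STANDARD TABLE OF sflight,
      go_salv    TYPE REF TO cl_salv_table,
      gt_fcat    TYPE lvc_t_fcat,
      go_grid    TYPE REF TO cl_gui_alv_grid.
FIELD-SYMBOLS <fcat> TYPE lvc_s_fcat.

SELECT * FROM sflight INTO TABLE gt_sflight UP TO 20 ROWS.

" 1) Dynamic field catalogue, courtesy of the SALV
cl_salv_table=>factory( IMPORTING r_salv_table = go_salv
                        CHANGING  t_table      = gt_sflight ).
gt_fcat = cl_salv_controller_metadata=>get_lvc_fieldcatalog(
            r_columns      = go_salv->get_columns( )
            r_aggregations = go_salv->get_aggregations( ) ).

" 2) Make a column editable - the very thing the SALV refuses to do
READ TABLE gt_fcat ASSIGNING <fcat> WITH KEY fieldname = 'PRICE'.
IF sy-subrc = 0.
  <fcat>-edit = abap_true.
ENDIF.

" 3) Display with the editable grid
CREATE OBJECT go_grid
  EXPORTING i_parent = cl_gui_container=>default_screen.
go_grid->set_table_for_first_display(
  CHANGING it_outtab       = gt_sflight
           it_fieldcatalog = gt_fcat ).
```

Everything the SALV computed for free (column texts, output lengths) is reused, and the grid on top of it happily accepts the EDIT flag.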

 


image003.jpg

In conclusion, next year I will be publishing a blog celebrating the 9th annual International Editable SALV Day. See you then!

 

Cheersy Cheers

 

Paul

 

 

Example on how to modify BEGDA and/or ENDDA of any PA infotype using the decoupled framework


Introduction

 

I was surprised not to find any examples on how to modify the start and/or end dates of PA infotypes using the decoupled framework. Typically the start and end dates are part of the key identifying the record. Unless you know what you are doing, you will most likely get a CX_HRPA_INVALID_PARAMETER exception with the parameter PRIMARY_RECORD-PSKEY. The important part is to read the existing record, create a new one (with the new key values) and pass the old record along when calling the MODIFY method.

 

Acknowledgements

 

All the code was copied and adapted from the standard function module HR_CONTROL_INFTY_OPERATION, the decoupled framework parts of it.

 

The Code

 

First the definition for method CHANGE_BEGDA_ENDDA

 

  class-methods CHANGE_BEGDA_ENDDA
    importing
      value(IV_TCLAS)     type PSPAR-TCLAS default 'A'
      value(IV_INFTY)     type PRELP-INFTY
      value(IV_SUBTY)     type P0001-SUBTY
      value(IV_OBJPS)     type P0001-OBJPS
      value(IV_SPRPS)     type P0001-SPRPS
      value(IV_SEQNR)     type P0001-SEQNR default '000'
      value(IV_PERNR)     type P0001-PERNR
      value(IV_BEGDA)     type P0001-BEGDA
      value(IV_ENDDA)     type P0001-ENDDA
      value(IV_BEGDA_NEW) type P0001-BEGDA
      value(IV_ENDDA_NEW) type P0001-ENDDA
      value(IV_TEST_MODE) type BOOLE_D default ' '
    exporting
      value(ET_MESSAGES)  type HRPAD_MESSAGE_TAB
      value(EV_OK)        type BOOLE_D .

 

 

Then the implementation of method CHANGE_BEGDA_ENDDA

 

method change_begda_endda.
  data lr_message_list        type ref to cl_hrpa_message_list.
  data lr_masterdata_bl       type ref to if_hrpa_masterdata_bl.
  data container              type ref to if_hrpa_infty_container.
  data old_container          type ref to cl_hrpa_infotype_container.
  data new_container          type ref to if_hrpa_infty_container.
  data new_infotype_container type ref to cl_hrpa_infotype_container.
  data infotype_ref           type ref to data.
  data lv_is_ok               type boole_d.
  data lv_dummy               type string.
  data ls_msg                 type symsg.
  data container_tab          type hrpad_infty_container_tab.
  data container_if           type hrpad_infty_container_ref.
  data t777d                  type t777d.
  data lv_has_error           type boole_d.
  data lv_count               type i.

  field-symbols <pshdr>  type pshdr.
  field-symbols <pnnnn>  type any.
  field-symbols <pskey>  type pskey.
  field-symbols <record> type any.
  field-symbols <pxxxx>  type any.

  create object lr_message_list.

  call method cl_hrpa_masterdata_factory=>get_business_logic
    importing
      business_logic = lr_masterdata_bl.

  check lr_masterdata_bl is bound.

  call method lr_masterdata_bl->read
    exporting
      tclas           = iv_tclas
      pernr           = iv_pernr
      infty           = iv_infty
      subty           = iv_subty
      objps           = iv_objps
      sprps           = iv_sprps
      begda           = iv_begda
      endda           = iv_endda
      seqnr           = iv_seqnr
      mode            = if_hrpa_masterdata_bl=>exact_matching_record
      no_auth_check   = abap_true
      message_handler = lr_message_list
    importing
      container_tab   = container_tab
      is_ok           = lv_is_ok.

  if lv_is_ok eq abap_true.
    describe table container_tab lines lv_count.

    if lv_count gt 1.
      lv_is_ok = abap_false.

      message e016(pg) with 'Multiple records found, limit to exactly one' into lv_dummy.
      move-corresponding sy to ls_msg.

      call method lr_message_list->if_hrpa_message_handler~add_message
        exporting
          message = ls_msg
          cause   = lr_message_list->if_hrpa_message_handler~infotype_generic.
    elseif lv_count eq 0.
      lv_is_ok = abap_false.

      message e009(pg) with iv_infty into lv_dummy.
      move-corresponding sy to ls_msg.

      call method lr_message_list->if_hrpa_message_handler~add_message
        exporting
          message = ls_msg
          cause   = lr_message_list->if_hrpa_message_handler~infotype_generic.
    else.
      read table container_tab into container index 1.
    endif.
  endif.

  if lv_is_ok = abap_false.
    call method lr_message_list->get_abend_list
      importing
        messages = et_messages.

    if et_messages is initial.
      call method lr_message_list->get_error_list
        importing
          messages = et_messages.
    endif.

    return.
  endif.

  old_container ?= container.

  try.
      call method old_container->if_hrpa_infty_container_data~primary_record_ref
        importing
          pnnnn_ref = infotype_ref.
    catch cx_hrpa_violated_assertion.
  endtry.

  assign infotype_ref->* to <pxxxx>.

  t777d = cl_hr_t777d=>read( infty = iv_infty ).

  create data infotype_ref type (t777d-ppnnn).

  assign infotype_ref->* to <pnnnn> casting like <pxxxx>.
  <pnnnn> = <pxxxx>.
  assign <pnnnn> to <pshdr> casting.
  <pshdr>-infty = iv_infty.

  assign infotype_ref->* to <record> casting like <pxxxx>.
  assign component 'PSKEY' of structure <record> to <pskey>.

  call method lr_masterdata_bl->get_infty_container
    exporting
      tclas           = 'A'
      pskey           = <pskey>
      no_auth_check   = abap_true
      message_handler = lr_message_list
    importing
      container       = container_if.

  new_infotype_container ?= container_if.

  <pskey>-endda = iv_endda_new.
  <pskey>-begda = iv_begda_new.

  new_infotype_container ?= new_infotype_container->modify_key( <pskey> ).
  new_infotype_container ?= new_infotype_container->modify_primary_record( <record> ).

  new_container ?= new_infotype_container.

  if iv_test_mode eq abap_false.
    call method lr_masterdata_bl->modify
      exporting
        old_container   = old_container
        message_handler = lr_message_list
      importing
        is_ok           = lv_is_ok
      changing
        container       = new_container.

    if lr_message_list->has_error( ) = abap_true or
       lr_message_list->has_abend( ) = abap_true.
      lv_has_error = abap_true.
    else.
      lv_has_error = abap_false.
    endif.

    if lv_has_error = abap_true.
      call method lr_message_list->get_abend_list
        importing
          messages = et_messages.

      if et_messages[] is initial.
        call method lr_message_list->get_error_list
          importing
            messages = et_messages.
      endif.

      call method lr_masterdata_bl->if_hrpa_buffer_control~initialize.
    else.
      call method lr_masterdata_bl->flush
        exporting
          no_commit = ' '.
    endif.
  endif.

endmethod.

 

To use it, you would simply call the method with all required parameters

 

  data lt_messages type hrpad_message_tab.
  data lv_ok type boole_d.

  call method change_begda_endda
    exporting
      iv_tclas     = 'A'
      iv_infty     = '0002'
      iv_subty     = ''
      iv_objps     = ''
      iv_sprps     = ''
      iv_seqnr     = '000'
      iv_pernr     = '12345678'
      iv_begda     = '20001201'
      iv_endda     = '99991231'
      iv_begda_new = '20101201'
      iv_endda_new = '99991231'
      iv_test_mode = ' '
    importing
      et_messages  = lt_messages
      ev_ok        = lv_ok.

 

 

This would change the start date of infotype 0002 for employee 12345678 from 12/01/2000 to 12/01/2010.

 

Limitations

 

I'm sure there are many situations where the code will break. For example, I didn't bother with secondary records, and the code handles only one record at a time. In addition, the code should check that the decoupled framework can actually be used for the infotype in question. Nevertheless, I thought I'd share this as it may help folks out there.

A bug in bapi BAPI_OUTB_DELIVERY_CREATE_SLS (and its solution)?


Hi community,

 

Another blazing fast and short blog post of mine. It seems I have come across a bug in standard bapi BAPI_OUTB_DELIVERY_CREATE_SLS.

 

I was trying to use it to deliver completely a sales order, but I ran into an interesting issue.

 

This BAPI calls function module RV_DELIVERY_CREATE, but, alas, it calls it without a "selektionsdatum", which seems to be what the "due_date" in the BAPI is for.

 

I'm not sure about this, but the fact is, if I populate this field (selektionsdatum) with the value from due_date, it works (for me... in the system I work with...).

 

So what I did was implement an implicit enhancement at the beginning of the BAPI, and I store the value of due_date in the attribute of a global class.

 

Then I created another implicit enhancement implementation at the beginning of RV_DELIVERY_CREATE and get the due_date into selektionsdatum.

 

There ya go.

 

If this is helpful, great. If it's not, I'm sorry.

 

If I'm terribly mistaken about this, let me know.

 

Cheers,

Bruno

My CDS view self study tutorial - Part 6 consume table function in CDS view



Let's try to resolve one real issue now. What we want to achieve is: in CRM we need a CDS view which returns the service order guid together with its Sold-to Party information, "Title" ( Mr. ) and "Name" ( blGMOUTH ).

clipboard1.png

The title and Name information are stored on table BUT000, while Service order transactional information is maintained in table CRMD_PARTNER, which has a field PARTNER_NO ( CHAR32 ) linking to table BUT000's PARTNER_GUID ( RAW16 ).

clipboard2.png

clipboard3.png

It is not allowed to join these two fields since their data types are not equal. This question was asked in this SCN thread: ABAP CDS View: join tables on columns of different type .


As suggested in the Correct Answer, this issue can be resolved by using a CDS Table Function. Here below are the detailed steps.

clipboard5.png

1. Create a new table function

clipboard6.png

@ClientDependent: true
@AccessControl.authorizationCheck: #NOT_REQUIRED
define table function ztf_BP_DETAIL
  with parameters
    @Environment.systemField: #CLIENT
    clnt : abap.clnt
  returns {
    client       : s_mandt;
    partner_guid : BU_PARTNER_GUID;
    partset_guid : CRMT_OBJECT_GUID;
    partner_no   : CRMT_PARTNER_NO;
    bp_guid      : BU_PARTNER_GUID;
    title        : AD_TITLE;
    name         : BU_NAME1TX;
  }
  implemented by method
    zcl_amdp_bp_detail=>crmd_partner_but000;

With the keyword "with parameters", the client parameter is defined, which acts as an importing parameter for the ABAP class method zcl_amdp_bp_detail=>crmd_partner_but000. The keyword "returns" defines the available fields which can be consumed by other CDS entities.

 

For further information about AMDP ( ABAP Managed Database Procedure ), please refer to this document Implement and consume your first ABAP Managed Database Procedure on HANA or this blog An example of AMDP( ABAP Managed Database Procedure ) in 740 .

 

2. Create a new AMDP implementation

Create a new ABAP class zcl_amdp_bp_detail by copying the following source code:

 

CLASS zcl_amdp_bp_detail DEFINITION
  PUBLIC
  FINAL
  CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_amdp_marker_hdb.
    CLASS-METHODS crmd_partner_but000 FOR TABLE FUNCTION ztf_bp_detail.
  PROTECTED SECTION.
  PRIVATE SECTION.
ENDCLASS.

CLASS zcl_amdp_bp_detail IMPLEMENTATION.
  METHOD crmd_partner_but000
      BY DATABASE FUNCTION FOR HDB
      LANGUAGE SQLSCRIPT
      OPTIONS READ-ONLY
      USING crmd_partner but000.
    RETURN SELECT sc.client       AS client,
                  sc.partner_guid AS partner_guid,
                  sc.guid         AS partset_guid,
                  sc.partner_no   AS partner_no,
                  sp.partner_guid AS bp_guid,
                  sp.title        AS title,
                  sp.name1_text   AS name
             FROM crmd_partner AS sc
             INNER JOIN but000 AS sp ON sc.client = sp.client AND
                                        sc.partner_no = sp.partner_guid
             WHERE sc.client = :clnt AND
                   sc.partner_fct = '00000001'
             ORDER BY sc.client;
  ENDMETHOD.
ENDCLASS.

Here, in the INNER JOIN condition, the two columns of CRMD_PARTNER and BUT000 are joined. The importing parameter is used in the SQLScript source code by adding a ":" before the variable name. The hard-coded "00000001" is the constant value for the partner function "Sold-to Party".

clipboard8.png

3. Consume the created table function in CDS view

 

@AbapCatalog.sqlViewName: 'zcpartner'
@AbapCatalog.compiler.compareFilter: true
@AccessControl.authorizationCheck: #CHECK
@EndUserText.label: 'partner detail'
define view Z_c_partner as select from crmd_orderadm_h
inner join crmd_link as _link on  crmd_orderadm_h.guid = _link.guid_hi
                              and _link.objtype_hi  = '05'
                              and _link.objtype_set = '07'
inner join ztf_bp_detail( clnt: '001' ) as _bp on _link.guid_set = _bp.partset_guid
{
  key crmd_orderadm_h.guid,
  --_link.objtype_hi as header_type,
  --_link.objtype_set as item_type,
  _bp.bp_guid,
  _bp.partner_no,
  _bp.name,
  case _bp.title
    when '0001' then 'Ms.'
    when '0002' then 'Mr.'
    when '0003' then 'Company'
    when '0004' then 'Mr and Mrs'
    else 'Unknown'
  end as title
}

Please note that the table function created in step 1 can be consumed directly, just like a normal CDS view - see the inner join on ztf_bp_detail above.

Since the table function declares the client as a parameter, I pass the current client id 001 when consuming it. The fields defined as returning parameters of the table function can be used in the consuming view.

clipboard9.png

4. Test the whole solution


Click F8 on the view z_c_partner, check whether data previews as expected.

clipboard10.png

or you can also check against single data record, by clicking "SQL Console", maintain the SQL and click Run:

clipboard11.png

The same test can also easily be done on the ABAP side:

clipboard12.png

clipboard13.png
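For reference, the ABAP-side test shown in the screenshots boils down to a simple Open SQL select on the consumption view. This is a hedged sketch, assuming a release that supports selecting CDS entities by their DDL source name (7.50+); the field names are taken from the view definition above.

```abap
REPORT ztest_c_partner.

" Read directly from the CDS consumption view, like from any other view
SELECT guid, partner_no, name, title
  FROM z_c_partner
  INTO TABLE @DATA(lt_partner)
  UP TO 10 ROWS.

LOOP AT lt_partner INTO DATA(ls_partner).
  WRITE: / ls_partner-partner_no, ls_partner-name, ls_partner-title.
ENDLOOP.
```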

ZINCLUDE_ASSEMBLER - develop conveniently, publish as a single file


Hi Community,

 

This post is dedicated to open source lovers.

 

In a couple of open source projects I've seen the following dilemma:

 

  • Convenience for users: the code must be easily installable, so there is a tendency to keep the whole code (program) as one piece.
  • contra Convenience for the developer: when the code grows, it becomes difficult to support as one piece.

 

Of course there are tools like SAPLINK and ZABAPGIT available which solve this issue (and not only this one, of course ). However, they must be installed on the target system, which adds a restriction.

 

Recently, I've published at Github a tool that addresses this issue from, let's say, the publisher's perspective. The tool is called ZINCLUDE_ASSEMBLER, and what it does is simply "include" a program's includes which belong to the same dev package into one piece of code.

 

illustration_small.png

 

So it can be used to publish easy-to-install single-file code while still enjoying nice code structure in dev environment. The result can be saved to a file or to another program (kind of ZXXX_BUILD).
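The core mechanics can be sketched in a few lines: READ REPORT fetches a program's source, and every INCLUDE statement is replaced by the body of the include. This is a simplifying sketch - the naive pattern matching and the program name are illustrative assumptions; the real tool also checks the dev package and handles edge cases.

```abap
DATA: lt_source TYPE STANDARD TABLE OF string,
      lt_result TYPE STANDARD TABLE OF string,
      lt_incl   TYPE STANDARD TABLE OF string,
      lv_name   TYPE syrepid.

READ REPORT 'ZMY_PROGRAM' INTO lt_source.        " main program source

LOOP AT lt_source INTO DATA(lv_line).
  IF lv_line CP 'include z*.'.                   " naive INCLUDE detection
    lv_name = lv_line+8.                         " text after 'INCLUDE '
    REPLACE '.' IN lv_name WITH space.
    CONDENSE lv_name.
    READ REPORT lv_name INTO lt_incl.
    APPEND LINES OF lt_incl TO lt_result.        " inline the include body
  ELSE.
    APPEND lv_line TO lt_result.
  ENDIF.
ENDLOOP.
```

lt_result then holds the assembled single-file source, ready to be saved to a file or written into a build program.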

 

The project homepage at Github: https://github.com/sbcgua/abap_include_assembler

 

I hope it will be of use to someone

 

All the best, Alexander Tsybulsky

An ABAP tool to get ABAP source codes line number


You can use this tool ( an ABAP report ) to get the number of lines of your ABAP source code.

 

How to use this tool

 

Just specify the criteria based on which the source code will be scanned and the line count will be calculated.

clipboard12.png


Execute it, and it will show you detailed statistics about line counts and the total count. Double-click on an ALV list item and you can navigate to the method's source code.

clipboard13.png

How to get the source code of this tool

 

This tool consists of the following ABAP objects ( as also listed in above picture):

 

  • report ZTOOL_CODE_LINE_COUNT
  • include ZTOOL_DEV_OBJ_SELSCR1
  • class ZCL_TOOL_RS_SERVICE
  • class ZCL_TOOL_SRC_CODE_ANALYZE
  • class ZCL_TOOL_SRC_CODE_LOCATION
  • class ZCL_TOOL_SRC_CODE__ANALYSIS
  • interface ZIF_TOOL_SRC_CODE__ANALYSIS
  • table ZTOOL_METRICS_LO

 

You can get all their source code from this github repository. Feel free to change the source code to fulfill your own requirement.
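If you just want the basic idea behind such a metric, the heart of it can be sketched with READ REPORT. The program name below is an illustrative assumption; the real tool scans classes and methods according to the selection criteria above.

```abap
DATA: lt_source TYPE STANDARD TABLE OF string,
      lv_count  TYPE i.

READ REPORT 'ZSOME_REPORT' INTO lt_source.
IF sy-subrc = 0.
  LOOP AT lt_source INTO DATA(lv_line).
    " Skip blank lines and full-line comments for a rough lines-of-code figure
    IF strlen( lv_line ) = 0 OR lv_line(1) = '*'.
      CONTINUE.
    ENDIF.
    lv_count = lv_count + 1.
  ENDLOOP.
  WRITE: / 'Lines of code:', lv_count.
ENDIF.
```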

 

Text Symbols:

clipboard1.png

Selection Texts:

clipboard2.png

clipboard3.png

Table structure

clipboard4.png


Getting Started with ABAP Programming Model for Fiori Apps


This blog provides a collection of information (presentations, blogs, videos, ...) about the ABAP programming model for Fiori Apps which is delivered in the ABAP stack starting from SAP NetWeaver 7.5 SP01.

PS: Updates will be provided regularly.

 

Latest News : 

Currently no news available.

Scenario 1: Develop a Simple List Reporting App

In this introductory scenario, you have the opportunity - starting from an already existing data model - to develop a simple list-reporting scenario based on the standardized ABAP programming model for SAP Fiori. You will be guided step by step through the new development infrastructure, which includes technologies such as Core Data Services (CDS), SADL, and SAP Gateway.

Access Tutorial

Watch the video (coming soon)

 

 

Scenario 2: Develop an Advanced List Reporting App with Search and Analytical Capabilities

Starting from the elementary list reporting (see scenario 1), you may want to add some further list reporting functions. For example, if your table or list contains many rows, it becomes difficult for end users to find the information they need. To facilitate finding the desired information, you can provide selection fields (filters) to specify the range of information that the end user is looking for. In addition, you may want to specify the positioning of columns or prioritize, or even hide, specific fields in your table or list.

Access Tutorial

Watch the video (coming soon)

ABAP 740 – Is CONSTANT not a Static Attribute anymore?


Originally published at - ABAP 740 – Is CONSTANT not a Static Attribute anymore?


CLASS_CONSTRUCTOR would be called automatically whenever the class is accessed – either by creation of an instance or by accessing any component. But it seems this has changed with ABAP 740.


Preface

In the article CLASS_CONSTRUCTOR and CONSTRUCTOR: Who comes before whom?, we discussed how both constructors are called at runtime. Before reading further, I suggest you take a trip down memory lane and refresh your ideas on when CLASS_CONSTRUCTOR is called.

 

Thanks to Ramakrishna Koliparthi for pointing out this odd behavior. I would not have tried this myself, on the assumption that it would work the same across all versions.

 

Example 1 in ABAP 740

The same example 1 from the article CLASS_CONSTRUCTOR and CONSTRUCTOR: Who comes before whom? generates a different output. Comparing both outputs next to each other:

 

ABAP_740_Class_constructor_Ex_1_compare.png

 

What ?


In the earlier version, when the constant was accessed for the first time, the CLASS_CONSTRUCTORs of all the classes involved in the class hierarchy were executed. But now, in ABAP 740, it doesn't execute the CLASS_CONSTRUCTOR at all - not even the static constructor of that particular class, e.g. LCL_D.


From the keyword help: Constructor


The static constructor is called automatically exactly once per class and internal session before the class is first accessed. An access to the class is the creation of an instance of the class or the addressing of a static component using the class component selector.

But it didn’t trigger as expected!

 

Adding a “Real” Static Attribute

My thought was that there was something wrong with how CLASS_CONSTRUCTOR is being called. I didn't trust the highlighted statement from the help. So, to see the behavior and to test whether what the help says is true, I added a static attribute in the class LCL_D and accessed that.

 

 

 

* New Definition for LCL_D
* ==== LCL_D ===== *
CLASS lcl_d DEFINITION INHERITING FROM lcl_c.
  PUBLIC SECTION.
    CONSTANTS: c_d TYPE char1 VALUE 'D'.
    CLASS-DATA: v_d TYPE char1.
    CLASS-METHODS: class_constructor.
    METHODS constructor.
ENDCLASS.                    "lcl_d DEFINITION

LOAD-OF-PROGRAM.
  WRITE: / '... LOAD-OF-PROGRAM ...'.

START-OF-SELECTION.
  SKIP 2.
  WRITE: / '... START-OF-SELECTION ...'.
  WRITE: / 'LCL_D=>V_D  ..... ', lcl_d=>v_d.

 

  ABAP_740_Class_constructor_Ex_4.png

 

Now, the output is the same as it was before ABAP 740.

 

Why?

Based on this, and on the fact that it worked as expected in versions earlier than ABAP 740, the expected behavior was to trigger all the static constructors within the hierarchy, but it did not. So, the question in the subject line: is a constant not a static attribute anymore? Or has SAP changed the way constants are loaded into the PXA in ABAP 740? Maybe all the constants are loaded already, and thus there is no need to load them again, hence no CLASS_CONSTRUCTOR call.

 

From help on CONSTANTS

Constants are stored in the PXA and are available to all programs.

What do you think? Let me know by your comment.

 

Also, I'm hoping that Horst Keller or someone from his league might be able to shed some light on this.

Smartform Version Comparison


There has long been a requirement to compare two Smartform versions.

In SAP there is no feature for this: no past versions are maintained, and there is certainly no remote comparison.

 

This makes comparison and version management of smartforms a tedious task.

 

I was faced with such a task too. There were multiple complex smartforms, and with several project activities and phases running simultaneously, it was very difficult to trace what went wrong; maintaining and reverting changes in case of issues took a lot of effort.

 

Thus began my journey down the lane on how to overcome this problem.

 

My initial research ("Google") yielded no fruitful results. All I got was that it is not possible in SAP to compare smartforms, and that we would need to download the XML files and save them locally in order to do version management.

And in case we need to compare, we will need to upload and create a temporary version and then compare each window and element.

 

Well, to me that did not make a lot of sense, since in many cases it would be easier to just create the form anew, and yes, I believe that is what must have been done in most cases.

 

So I started with my own utility program for Smartform Version Comparison.

 

I noticed that on download an XML file is always generated, and this file keeps more or less the same structure.

There are tags which we can traverse and identify to compare the smartforms.

And also the entire data in the smartform is present in the XML.

 

So, why not automate this and voila now I have a working program to compare smartforms.

I have uploaded the source code of the same on Github at https://github.com/sajid-a/SmartformVersionCompare
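In essence, the program loads both XML exports and reports the lines that differ. Here is a minimal sketch of that idea; the file names are placeholders, and the real tool traverses the XML tags rather than comparing raw lines.

```abap
DATA: lt_old TYPE STANDARD TABLE OF string,
      lt_new TYPE STANDARD TABLE OF string.

" Load both XML exports from the frontend
cl_gui_frontend_services=>gui_upload(
  EXPORTING filename = 'C:\smartform_v1.xml'
  CHANGING  data_tab = lt_old ).
cl_gui_frontend_services=>gui_upload(
  EXPORTING filename = 'C:\smartform_v2.xml'
  CHANGING  data_tab = lt_new ).

LOOP AT lt_new INTO DATA(lv_line).
  " Report every line that exists only in the new version
  READ TABLE lt_old WITH KEY table_line = lv_line TRANSPORTING NO FIELDS.
  IF sy-subrc <> 0.
    WRITE: / 'Changed/new:', lv_line.
  ENDIF.
ENDLOOP.
```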

 

This is currently a working version with a known bug: it can only compare smartforms up to a certain size. The reason is that currently we are not able to process the complete string provided by the XML in the file. Other than that, it can be used to compare smartforms.

 

The process for comparison is as follows:

 

1. Download and Maintain the XMLs of the smartform before making any change. This will act as your repository of version management. Alternatively an SVN can be used for this purpose

 

files.PNG

 

2. Execute the program and provide the two XML files that you want to compare.

upload.PNG

3. On execution, the changes in the two versions will be displayed.

 

3.PNG

4.PNG

The program is currently in the Alpha Phase and hence any suggestions or improvements are welcome.

Also all are welcome to collaborate on the source code and help in the development of a comprehensive tool.

 

The features to be added in the current program are as follows:

1. Only Code Comparison

2. Remote System Comparison

3. Automatic Version Management and Comparison

4. Auto generation of XML by just entering the Smartform Name

5. Bug fixes

 

Hope this post proves useful to all who are in need of a way to compare smartforms.

Working with SABRIX when Japanese Entity is Involved


In this technical solution I am sharing my experience with SABRIX when doing business with Japanese vendors or customers.

 

Many questions might arise if you do not know SABRIX. If SABRIX is known to you, then the next question will be: what could be the issue in doing business with Japanese vendors or customers?

 

SABRIX is third-party tax calculation software. Its input is the tax jurisdiction code (TXJCD), ship-from and ship-to locations, and other attributes.

 

Issue:

 

  1. Japan has a very specific requirement for collective invoicing, for which a special business function was activated. This functionality does not require TXJCD, and standard SAP ignores it. For that reason, it was decided that Japan would go with the non-SABRIX option, similar to Brazil.

 

   2. We have scenarios where other countries do business with Japan (NON-JAPAN to JAPAN). These countries use SABRIX for tax calculation. As TXJCD is a required field for SABRIX, it gives an error when calculating taxes.


   3. Affected areas are Billing and Purchasing.


Possible Business Scenarios

Sales

  1. Non Japan entity shipping to customer in Japan (Non Japan to Japan).
  2. Non Japan entity shipping to customer in Japan from plant in Japan (JAPAN to  JAPAN, SO is owned NON JAPAN).
  3. Non Japan entity shipping to Non JAPAN customer from Japan plant (JAPAN to Non SP )

  Purchase 

  1. Purchase order with non-Japanese vendor and delivery address in Japan (Non JAPAN to JAPAN)
  2. Purchase order with Japanese vendor and delivery address in Japan (JAPAN to JAPAN, PO is owned NON JAPAN)
  3. Purchase order with Japanese vendor and delivery address not in Japan (JAPAN to Non JAPAN)

   

Solutions in the different areas:


Order

User Exit: USEREXIT_PRICING_PREPARE_TKOMK (Program: MV45AFZZ)


Logic:

  1. Check for TKOMK-BUKRS. If not Japan Company Code, then
  2. Check customer from XVBAK-KUNNR, if customer belongs to Japan.
  3. If both conditions suffice, Update TKOMK-TXJCD ; update ship to and ship from.
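A hedged sketch of this header-level logic inside USEREXIT_PRICING_PREPARE_TKOMK follows. The comparison against country key 'JP' and the defaulted jurisdiction code value are simplifying assumptions for illustration; the real implementation also derives ship-to and ship-from.

```abap
* MV45AFZZ, form USEREXIT_PRICING_PREPARE_TKOMK
DATA: lv_cc_land   TYPE land1,
      lv_cust_land TYPE land1.

* 1) Company code must not be Japanese
SELECT SINGLE land1 FROM t001 INTO lv_cc_land
  WHERE bukrs = tkomk-bukrs.
CHECK lv_cc_land <> 'JP'.

* 2) Sold-to customer must be Japanese
SELECT SINGLE land1 FROM kna1 INTO lv_cust_land
  WHERE kunnr = xvbak-kunnr.
CHECK lv_cust_land = 'JP'.

* 3) Default the tax jurisdiction code so SABRIX does not fail
tkomk-txjcd = 'JP0000000'.   " placeholder value, an assumption
```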



  Invoice 


Header Level:

User Exit: USEREXIT_PRICING_PREPARE_TKOMK (Program: RV60AFZZ)

Logic:

  1. Check for TKOMK-BUKRS. If not Japan Company Code,
  2. Check customer from TKOMK-KUNWE, if customer belongs to Japan.
  3. If both conditions suffice, Update TKOMK-TXJCD; update ship to and ship from.



  Item Level

User Exit: USEREXIT_PRICING_PREPARE_TKOMK (Program: RV60AFZZ)

Logic:

  1. Check for  VBRK-BUKRS. If not Japan Company Code, then
  2. Check customer from VBRK-KUNAG, if customer belongs to Japan.
  3. If both conditions suffice, Update XVBRP-TXJCD ; update ship to and ship from.



Purchase

BADI: ME_TAX_FROM_ADDRESS

Logic:

  1. Check if TXJCD is populated or not. If not then below is the logic
  2. Populate country key as the TXJCD from either Vendor, Customer, Address number and Plant.
  3. If Vendor or customer is present then populate their country key.
  4. If Address number is present then populate the country key from the address.
  5. If Plant is present then populate the Plants country.
  6. Population of Country gets prioritized in the following order

  Vendor → Customer → Manual Address Number in PO → Plant 

  As data is filled in the PO, during creation of an invoice the same data moves into it as well, populating the TXJCD at line item level as expected.




Populating  Ship from

  1. Ship from needs to be populated when a Non Japan entity is delivering from Japan plant (Sales Order / Purchase Order).

  Program: ZXFYTU03 

Logic:

  1. Check if Ship From TXJCD is initial.
  2. If yes, check the plant’s country key and populate it.


I hope this will give the basic Idea about the programs where we have to readily look into when using SABRIX.


Regards,

Sagar Puppala.

Sending files from non-SAP to SAP system via Javascript and Webservice (Quick and Dirty)


A year ago, I encountered a challenge working with a third-party document repository system that does not expose a real RESTful web service.

 

The challenge was that the third-party system uploads PDF files that need to be attached in SAP via GOS. There is no established interface between this old system and the SAP system, but luckily the old system can still run JavaScript.

 

To make it worse, the SAP system did not have NW Gateway, and the customer would not pay for NW Gateway for this requirement, so implementing the official RESTful SDK was not possible.

 

So the solution was to trick the web service into accepting a file. Another problem is that it cannot send a complex (binary) payload directly.

 

So the solution I implemented was:

 

1. I created an unofficial web service in the SAP system via SICF and custom handlers. If you are not familiar with this, see Pseudo-RESTful API in SAP without SAP NW Gateway; that tutorial shows how to create an unofficial RESTful service.

 

 

2. On the third-party system side, I created a custom web service consumer in JavaScript.

 

var postUrl = "http://servername:8000/zrest_2/zfilepost?xblnr=" + entry.xblnr
            + "&lifnr=" + entry.lifnr
            + "&sp_id=" + entry.id;

$.ajax({
  url: postUrl,
  type: "POST",
  data: filestring,
  success: function(data) {
    $("#wsSuccess").append(data.Result);
  }
});

I am using the standard jQuery API:

 

<script type="text/javascript" src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>

This calls the web service created in SAP.
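For reference, the same query string can also be built with URLSearchParams, which URL-encodes the values for you (the entry values below are made-up sample data):

```javascript
// Building the POST URL from the snippet above with URLSearchParams.
// entry holds sample values, not real document numbers.
var entry = { xblnr: "5100000123", lifnr: "0000100001", id: "42" };
var params = new URLSearchParams({
  xblnr: entry.xblnr,
  lifnr: entry.lifnr,
  sp_id: entry.id
});
var postUrl = "http://servername:8000/zrest_2/zfilepost?" + params.toString();
console.log(postUrl);
// http://servername:8000/zrest_2/zfilepost?xblnr=5100000123&lifnr=0000100001&sp_id=42
```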

 

The first two steps are not too difficult; there have been plenty of discussions about them before. The challenge now is how to send a file across this setup, and the solution is Base64.

 

What is Base64? Base64 is an encoding scheme that represents binary data as text using a radix-64 alphabet. The MIME specification lists Base64 as one of its binary-to-text encodings. Base64 has advantages and disadvantages that I will not discuss here.
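A quick round trip in Node.js shows the idea; the first four bytes of a PDF ("%PDF") encode to "JVBERg==":

```javascript
// Base64 round trip with Node's Buffer: binary -> text -> binary.
var original = Buffer.from([0x25, 0x50, 0x44, 0x46]); // the bytes of "%PDF"
var b64 = original.toString("base64");
console.log(b64); // JVBERg==
var decoded = Buffer.from(b64, "base64");
console.log(decoded.equals(original)); // true
```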

 

So, moving on with the steps.

 

3. The next step is to convert the binary file to Base64, which will in turn be sent via the JavaScript in step 2. In JavaScript, you need to convert the file to Base64:

 

if (this.status == 200) {
  var blob = new Blob([this.response], { type: "application/pdf" });
  console.log(blob);

  reader.readAsDataURL(blob);
  reader.onloadend = function() {
    var result = reader.result;
    console.log(result);
    var bin = convertDataURIToBinary(result);
  };
}

function convertDataURIToBinary(dataURI) {
  var BASE64_MARKER = ';base64,';
  var base64Index = dataURI.indexOf(BASE64_MARKER) + BASE64_MARKER.length;
  var base64 = dataURI.substring(base64Index);
  var raw = window.atob(base64);
  var rawLength = raw.length;
  var array = new Uint8Array(new ArrayBuffer(rawLength));
  for (var i = 0; i < rawLength; i++) {
    array[i] = raw.charCodeAt(i);
  }
  return array;
}

Here, readAsDataURL encodes the file as a Base64 data URI; convertDataURIToBinary does the reverse, stripping the data URI header and decoding the Base64 text back into a byte array.
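The header-stripping step can be seen in isolation; the data URI below is a made-up minimal example:

```javascript
// Extracting the Base64 payload from a data URI, as convertDataURIToBinary does.
var dataURI = "data:application/pdf;base64,JVBERg==";
var BASE64_MARKER = ";base64,";
var base64 = dataURI.substring(dataURI.indexOf(BASE64_MARKER) + BASE64_MARKER.length);
console.log(base64); // JVBERg==
var bytes = Buffer.from(base64, "base64"); // Node equivalent of atob + Uint8Array
console.log(bytes.toString("latin1")); // %PDF
```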

 

4. On the SAP side, we use the ABAP function module SSFC_BASE64_DECODE to convert the Base64 text back into a binary file, with lv_string holding the value produced by the JavaScript:

 

  lv_string = iv_base64.
  SPLIT iv_base64 AT ','
        INTO lv_header
             lv_string.

* Alternative: rv_output = cl_http_utility=>if_http_utility~decode_base64( encoded = lv_string ).

  CALL FUNCTION 'SSFC_BASE64_DECODE'
    EXPORTING
      b64data = lv_string
    IMPORTING
      bindata = rv_output
    EXCEPTIONS
      OTHERS  = 8.

 

5. Lastly, we attach the file to GOS via SO_OBJECT_INSERT. This has been well documented, and you can find plenty of material to support it.

 

I hope this helps!
