Channel: ABAP Development

Row level lock using table maintenance views


Hi Guys,

  

While working with your own database tables (Z* tables), SAP offers a simple tool to add or edit their data: the table maintenance views, which you can call via transaction SM30. The problem is that only one user can edit such a table at a time. For my application this was unsatisfying, because several users worldwide could not be expected to wait until one of them had finished his changes. I searched the SCN and found several threads dealing with this issue. After implementing it myself, I wanted to share my knowledge with you and gather the information in one place.


So first you need the database table, and you have to generate a table maintenance view. While generating this view, choose the two-step maintenance type, because we want a list where entries are not directly editable. In the overview screen no entry is editable at all; in the detail screen an entry becomes editable when it is opened in change mode.

 

When you enter the table maintenance view via SM30, a table-level lock is set. This lock has to be removed. The easiest way to do that is a report that deletes the lock and then calls the maintenance view. In detail, this is done by reading all system locks, deleting the relevant lock via the function module ENQUE_DELETE, and then calling the table maintenance view via the function module VIEW_MAINTENANCE_CALL. It is advisable to create a transaction for this report, because the report has to run in order to delete the lock. Otherwise, if someone opens the table maintenance view via SM30, this will not work any more and the lock is set again.

 

Code Example:

 

REPORT ztest.

DATA: BEGIN OF seltab OCCURS 1.
        INCLUDE STRUCTURE vimsellist.
DATA: END OF seltab.

DATA: BEGIN OF excl_cua_funct OCCURS 1.
        INCLUDE STRUCTURE vimexclfun.
DATA: END OF excl_cua_funct.

DATA: lt_enq_del  TYPE STANDARD TABLE OF seqg3,
      lt_enq_read TYPE STANDARD TABLE OF seqg7,
      lw_enq_read TYPE seqg7,
      lw_enq_del  TYPE seqg3,
      lv_subrc    TYPE sy-subrc.

* Read all locks in the system
CALL FUNCTION 'ENQUE_READ2'
  EXPORTING
    gclient = sy-mandt
    gname   = ' '
    guname  = '*'
  TABLES
    enq     = lt_enq_read.

* Search for the table-level lock entry of our table
LOOP AT lt_enq_read INTO lw_enq_read WHERE gname EQ 'RSTABLE'
                                       AND garg CS 'Z_OURTABLE'.
  MOVE-CORRESPONDING lw_enq_read TO lw_enq_del.
  APPEND lw_enq_del TO lt_enq_del.
ENDLOOP.

* Delete the lock entry for our table
CALL FUNCTION 'ENQUE_DELETE'
  EXPORTING
    check_upd_requests = 1
  IMPORTING
    subrc              = lv_subrc
  TABLES
    enq                = lt_enq_del.

* Call the table maintenance view
CALL FUNCTION 'VIEW_MAINTENANCE_CALL'
  EXPORTING
    action    = 'U'
*   corr_number                    = '          '
*   generate_maint_tool_if_missing = ' '
*   show_selection_popup           = ' '
    view_name = 'Z_OURTABLE'.

        

 

Now every entry that we are editing has to be locked for other users. To achieve that, we need a lock object configured for our table Z_OURTABLE. Every time we enter the detail screen, we check whether the entry is locked and, if so, we set all fields to not editable. For that, the screen logic of the dynpro has to be modified: you can add a PBO module that calls the ENQUEUE function module created by your lock object. Be careful: if you change your database table, the screens have to be regenerated and the PBO module has to be added again.

 

Code example:

 

MODULE change_locking OUTPUT.

  CALL FUNCTION 'ENQUEUE_EZOURTABLELOCKOBJECT'
    EXPORTING
      kunnr          = zourtable-kunnr
    EXCEPTIONS
      foreign_lock   = 1
      system_failure = 2
      OTHERS         = 3.
  IF sy-subrc <> 0.
*   Row is locked by another user - switch all fields to display only
    MESSAGE 'Data locked by another user!' TYPE 'S'.
    LOOP AT SCREEN.
      screen-input = 0.
      MODIFY SCREEN.
    ENDLOOP.
  ENDIF.

ENDMODULE.

 

Every time a user opens a single record, it is locked via the lock object. If another user tries to open the same record, all fields are greyed out.
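When the user leaves the detail screen, the row lock should be released again; otherwise it is only freed when the session ends. A minimal sketch of a matching PAI module (the module name, the lock object EZOURTABLELOCKOBJECT and the key field kunnr mirror the example above and are assumptions):

```abap
MODULE release_locking INPUT.
* Release the row lock when the user leaves the detail screen
  IF sy-ucomm = 'BACK' OR sy-ucomm = 'EXIT'.
    CALL FUNCTION 'DEQUEUE_EZOURTABLELOCKOBJECT'
      EXPORTING
        kunnr = zourtable-kunnr.
  ENDIF.
ENDMODULE.
```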

 

 

I hope this is useful to someone.

 

kind regards Tobias


Where Are My Transports ?


Hi,


Recently there were some questions about transport information functions.


The attached program Y_R_EITAN_TEST_26_02 is my attempt at using some of the transport functions.

 

Functions:

TR_READ_GLOBAL_INFO_OF_REQUEST
TR_LOG_OVERVIEW_REQUEST
STRF_OPEN_PROT
TRINT_READ_LOG
TRINT_DISPLAY_LOG
TRINT_READ_REQUEST_HEADER

 

The program goes through table E070 based on the user selection.

For each E070 row the program calls function TR_READ_GLOBAL_INFO_OF_REQUEST; the output of the function is used to build the final output table.


The final output table is presented using cl_salv_table.


Transport logs and the action log are available via hotspot.
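The core loop described above can be sketched roughly like this (the selection is simplified, and the parameter names of TR_READ_GLOBAL_INFO_OF_REQUEST are assumptions; verify the interface in SE37):

```abap
DATA: lt_e070   TYPE STANDARD TABLE OF e070,
      ls_e070   TYPE e070,
      ls_cofile TYPE ctslg_cofile.

* Selection simplified for the sketch: all requests of the current user
SELECT * FROM e070 INTO TABLE lt_e070
  WHERE as4user = sy-uname.

LOOP AT lt_e070 INTO ls_e070.
* Parameter names are assumptions - check the function interface in SE37
  CALL FUNCTION 'TR_READ_GLOBAL_INFO_OF_REQUEST'
    EXPORTING
      iv_trkorr = ls_e070-trkorr
    IMPORTING
      es_cofile = ls_cofile.
* ... collect the import/export status per system into the output table
ENDLOOP.
```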


Screens:

 

screenshot_01.png

 

screenshot_02.png


Enjoy.


Eitan.

Software archaeology in SAP code


"REPORT HAS BEEN DEACTIVATED". You may find a statement like this within ABAP code such as reports. One may wonder what it actually means. Usually the case is that the code is obsolete and should no longer be used.


What makes SAP code obsolete? These are cases I’m thinking of:


  • A particular function implemented in the code was rewritten (refactored), so the old one is no longer needed and is obsolete.

 

  • A functionality implementation error. Some part of the functionality was intended to be present, but it was later decided not to implement it, while some initial code remained. Thus the remaining code is obsolete.

 

  • Implementation errors in the security area. There are potential security risks or vulnerable patterns in the code. These can be the following, just to name a few: hardcoded users or passwords, executing certain code without logging it or without checking authorization objects, direct access to critical DB tables, injection issues (ABAP, SQL), RFC execution, directory traversal, use of wait commands, etc. These errors were not detected during the security checks of the development cycle and were rolled out to customers' systems. These kinds of errors make parts of the code obsolete.

 

As soon as the refactored code is delivered, or implementation errors are discovered, corrections are prepared (mostly in the form of SAP Notes). These corrections mark the code as obsolete and prevent its execution.


softwarearcheological.png


That is basically what has happened when we see e.g. REPORT HAS BEEN DEACTIVATED in ABAP code.


To enforce that particular ABAP code will not be used, it is even commented out. The code needs to be commented out instead of simply removing the whole objects, in order to prevent unnecessary ABAP dumps. This way the user is informed about the obsolescence:


rep_is_obsolete_msg.png


Another point of view (as suggested in the comment that introduces the deactivation of the function) is software archaeology (see the first screenshot). Even though it seems to be practice at SAP that after a few more releases the particular code is not only commented out but also removed, it is very nice to see a trace of software archaeology :-)

 

 

PS: This blog is cross published on my personal blog site.

ABAP Editor Cherries!


I stumbled across some handy features in the new ABAP editor which may come in helpful for a busy ABAP programmer. Check them out.

 

Bookmarks

Have you found yourself spending a long time scrolling up or down to that particular block of code (maybe a declaration or function definition) you always need to refer to in the long program that you have written? Some resort to the find feature on the toolbar to do this. But here is a much quicker method: set a bookmark on frequently referred-to lines of code.

In the indicator margin, right-click and choose 'Set Bookmark'. A blue flag will appear to denote the bookmark.

 

  

 

To navigate to a bookmark, right-click on the indicator margin and select 'Go to Bookmark'.

 

 

You can set up to 10 bookmarks in a program.

 

Window Splitter


Even better: do you want to work on or view different parts of the code simultaneously? Double-click the button at the top right corner of the vertical scroll bar.

 

 

 

Toggling Caps Lock and Number Lock.

Double-click the marked area in the bottom right corner to toggle Caps Lock or Num Lock.

 

Change the default font and color coding .


 

Click on the editor options at the bottom right corner of the window.

 

You can provide your own color coding for different sections of code, or change the font type and size to increase readability. The default option will suit most people, but you do have a choice if you want a change!

The case of the missing cl_salv_form_layout_grid


Hi,


Recently there was a request to display more than one cl_salv_table on a screen, with headers.

Not able to display Header Details using cl_salv_table class by Idris Ahmed Khan


Display multiple cl_salv_table instances on one screen? Doing that is not complicated (e.g. using cl_gui_splitter_container).

 

Here is a sample:

screenshot_02.png


But the requirement was also to present some header info using cl_salv_form_layout_grid.


Here is a sample:


screenshot_03.png


I tried to create the required output but with no luck.


Then I saw the thread Adding header to alv(factory) in container, which led me to believe that I was fighting a lost battle.


Well..... Time for some workaround.

 

Program Y_R_EITAN_TEST_08_23 uses a cl_gui_splitter_container that is made up of 4 rows.


Rows 1 and 3 are filled using cl_gui_html_viewer and are used as headers (a replacement for cl_salv_form_layout_grid).


HTML allows formatting and colors to make any user happy....

 

You can adjust the initial height of the splitter rows using cl_gui_splitter_container->set_row_height .

 

Rows 2 and 4 are filled using cl_salv_table.
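Put together, the layout can be sketched roughly like this (gt_data and the concrete row height are placeholders; the real program Y_R_EITAN_TEST_08_23 differs in detail):

```abap
DATA: lo_splitter TYPE REF TO cl_gui_splitter_container,
      lo_html     TYPE REF TO cl_gui_html_viewer,
      lo_salv     TYPE REF TO cl_salv_table.

* Splitter with 4 rows in one column on the default screen container
CREATE OBJECT lo_splitter
  EXPORTING
    parent  = cl_gui_container=>screen0
    rows    = 4
    columns = 1.

* Row 1 (and likewise row 3): an HTML viewer acting as a header
CREATE OBJECT lo_html
  EXPORTING
    parent = lo_splitter->get_container( row = 1 column = 1 ).

* Row 2 (and likewise row 4): an ALV grid
TRY.
    cl_salv_table=>factory(
      EXPORTING
        r_container  = lo_splitter->get_container( row = 2 column = 1 )
      IMPORTING
        r_salv_table = lo_salv
      CHANGING
        t_table      = gt_data ).
    lo_salv->display( ).
  CATCH cx_salv_msg.
ENDTRY.

* Fix the initial height of the header row
lo_splitter->set_row_height( id = 1 height = 10 ).
```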


And here is the result:

screenshot_01.png

 

Enjoy.

 

p.s.

 

The program uses a status named STATUS_COMMON

 

screenshot_04.png

 

All the functions are "Exit Command"

 

screenshot_05.png

Automatically Generate Data Declarations While Inserting FM or Method Call


Summary

There is a very useful, time-saving ABAP Editor setting that enables automatic data declaration of the actual parameters when inserting a call to a function module (FM) or class method. The setting should work fine on ABAP release 731 and above.

After enabling the setting, the generated call pattern for an FM looks like this:

 

 

The call pattern for a class method looks like this:

 

 

The settings can be found in ABAP Editor > Utilities > Settings > ABAP Editor > Editor > Pattern

 

 

 

Story

I had seen the setting about a year ago on a 701 release, but its meaning was not obvious. Enabling the setting did nothing on that release.

The text ( Name Actual Parameter Same as Name Formal Param. ) does not suggest that the actual parameters will be declared automatically.

F1 help or verbatim Google search returned 0 hits.

 

This 4-year-old blog explains how to insert a custom pattern (FM call plus data declarations) using an ABAP Editor exit.

Call a function module in the ABAP Editor: Stop Crying - Start Laughing.

This ready-made solution was luckily present in my work environment. Excited by the thought of saving so much time, I started using the editor exit.

 

My work environment changed some time back (a different system on release 731) and the custom pattern was no longer with me.

I missed the functionality but didn't install the code due to:

  1. Laziness - It is a selective thing. When it comes to touch typing, I am not lazy at all.
  2. Reluctance - Basis/Security might question me if dumps came due to that exit.

 

A couple of days ago I finally installed the editor exit. Nowadays I am trying to use methods instead of forms. As an exercise, I thought of understanding how the code works, so that some day I can replace all forms with methods and perhaps improve the code using newer ABAP features.

 

The custom code reads the FM's formal parameters, determines the types, generates code for the data declarations and then generates the FM call. I wanted to see how the standard generates the FM call and reuse that code if possible. While debugging the standard Call Pattern > Call Function, I found that there is an FM which generates the FM call. After the FM call code was generated, there was additional code enclosed in an IF condition. It really looked like code for the data declarations, but it was not getting called.

 

 

The IF condition was on a field of table RSEUMOD, which stores user-specific ABAP Workbench settings. This meant the setting could be activated somewhere in the ABAP Editor settings. An added bonus was that a similar setting is present for class methods too, which my custom editor exit didn't handle. I tried the same thing on a 701-release ABAP trial and found that even though the setting is present in the options, there is no corresponding code to do something with it.
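To peek at what is stored for your user, you can read the table directly (a quick sketch; the exact flag fields for these settings vary by release, so inspect the structure in the debugger):

```abap
* User-specific ABAP Workbench settings live in table RSEUMOD
DATA ls_settings TYPE rseumod.

SELECT SINGLE * FROM rseumod
  INTO ls_settings
  WHERE uname = sy-uname.
```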

 

This is how I came to know about the setting while debugging standard code.

 

Some Notes

  1. I am also using the option "Functional Writing Style for CALL METHOD". Functional-style method calls help reduce the lines of code and can be used in expressions. For example, try creating a pattern for the get_instance( ) method of some class: the generated code will have the actual parameter on the left-hand side and the functional method call on the right-hand side.
  2. You can export the ABAP Workbench settings to a PC file so that they can be imported by your colleagues.
  3. The custom pattern uses the magic word *$&$MUSTER. Muster is German for pattern.

Creating a test data directory (variants in SE37) programmatically


Recently there was an issue in ECC while updating records to infotypes. A third party sends data to PI, and PI in turn calls ECC via an RFC-enabled FM. I was not sure whether the data received by ECC was right, and I had to depend on PI to get the payload and simulate it in ECC.

 

In this blog, I am going to explain how to create a test data directory programmatically, so that you don't need to depend on other middleware for your analysis.

 

1. Find the RFC user which will call your FM/BAPI. It is usually maintained by Basis in SM59.

2. Request your security team to add the parameter ID FBGENDAT with value X in the user master (transaction SU01).

3. Execute report FBGENDAT.

4. Enter the name of the FM/BAPI for which you wish to create a test data directory and execute the report. The options used are self-explanatory and described in the report. Option C is meant for debugging via the work process (transaction SM50).

5. In your FM/BAPI, write the lines of code below. Certain SAP standard FMs/BAPIs already contain this code, for example BAPI_PO_CHANGE.

 

INCLUDE fbgenmac.

SET EXTENDED CHECK OFF.
fbgenmac '<<YOUR_FM_NAME>>'.
SET EXTENDED CHECK ON.



Request your third party or middleware to send data to SAP.


1. Go to SE37, enter the FM/BAPI and press Execute.

2. Click the test data directory button on the toolbar.


You will observe that a new entry is present with date, time and user. You can use it to analyse your FM/BAPI.



Once you are done with your analysis, you can execute report FBGENDAT again and delete the relevant entries. This deactivates the creation of test data directories for future requests.



References:


How to Create a Test data to BAPI_PO_CREATE(1) and BAPI_PO_CHANGE - Supplier Relationship Management - SCN Wiki

517767 - Generate test data for function modules

539978 - Automatic generation of BAPI test data directory

Basic Smart form "source" scan


Hi,

 

Sometimes there is a need to do a source scan of smart forms.

 

The recent need that triggered this blog entry is here: How to remove hardcoded values in smartforms

 

Program Y_R_EITAN_TEST_31_08 (attached) is my attempt.

 

The program goes through the selected smart forms based on table stxfadm.

For each form it extracts an XML document representing the form.

 

The XML document is iterated and each node is checked against the selection screen parameters.

 

The filtered nodes are presented using cl_salv_table .

 

(I did say it is basic....)

 

screenshot_01.png

 

screenshot_02.png

 

Enjoy.


Empower your development


Hi ABAP friends,

 

after reading the blog ABAP Editor Cherries! I had the idea of showing you some tools that might power up your applications, or help you write performant, high-quality ABAP code.

 

Code Inspector

 

The Code Inspector can be configured/started via the transactions SCI/SCII. It is a powerful tool that provides checks for your ABAP applications. For example, a company-wide development guideline can be customized: if you define that all global variables begin with gf (global field), all deviations will be pointed out.

 

Its duties are:

  • checking development guidelines
  • advanced program checks
  • performance & security checks

 

 

It can also be started directly from the ABAP Editor, to be run with the default check variant.

 

Screenshot_codeIns.png

 

Screenshot_codeIns2.png

 

Running the Code Inspector definitely improves your coding and helps you standardize your programs, which will be a blessing if someone later tries to comprehend your thoughts.

 

I think this tutorial SAP Code Inspector (SCI) – Tutorial is quite useful.

 

Coverage Analyzer

 

You can call the Coverage Analyzer via the transaction SCOV. This tool can be used to check whether single code areas get called, and to see how often they have been processed. You can also evaluate how often programs, function modules or classes have been called. It can be used, for example, to find dead code, or to see the running time of single code areas in order to improve specific sections.

 

It has to be started first and records from that point on. If you record all program calls system-wide, be aware of the performance loss, which might not be huge but is noticeable.

 

Screenshot_coverAnaly.png

Screenshot_coverAnaly2.png

 

The screenshot above displays all methods, forms and modules with their percentage of processing, as well as the number of active calls.

 

Workload Monitor

 

The Workload Monitor is a very powerful tool to analyse the workload of several resources, or of your whole system, in detail. You can compare calls and runtime (CPU & DB) of transactions and much more.

 

Runtime Analysis

 

Duties of the Runtime Analysis:

  • measure running times
  • evaluate measurements
  • examples to compare techniques in ABAP

 

You can start the Runtime Analysis via the transaction code SAT. After that you have to fill in several parameters and a program/transaction or a function module you want to analyse. The result is a detailed list of information about your selection. You can then start analysing and optimizing your applications. One conceivable scenario is finding extremely long-running reports and optimizing them with small modifications (using hashed tables instead of standard tables, for example).
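The hashed-table remark can be illustrated with a small sketch (the table and key are illustrative):

```abap
DATA: lt_std    TYPE STANDARD TABLE OF mara,
      lt_hashed TYPE HASHED TABLE OF mara WITH UNIQUE KEY matnr,
      ls_mara   TYPE mara.

* Standard table: READ ... WITH KEY scans linearly, O(n) per lookup
READ TABLE lt_std INTO ls_mara WITH KEY matnr = 'MAT001'.

* Hashed table: constant-time lookup on the unique key
READ TABLE lt_hashed INTO ls_mara WITH TABLE KEY matnr = 'MAT001'.
```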

 

Screenshot_SAT.png

 

Splitscreen Editor

 

The last of the tools I want to introduce is the not very well-known Splitscreen Editor (SE39), which can be used to compare the source code of function modules, reports or classes with each other. This might be useful if you want to compare or copy code or functionality into your new application.

 

Screenshot_Splitscreen.png

 

I hope I could show you some standard transactions that might be useful, help you improve your development skills to build better-quality applications, and make development a little more comfortable for you.

 

My personal experience is that some of the tools above have been very useful for my company and all its developers in many cases.

 

Kind regards Tobi

Understanding CSV files and their handling in ABAP


In many ABAP developments we use CSV files, and sometimes there is confusion about CSV itself. Is it just a text file with values separated by commas?

 

Let's look at semantics of the various components involved so we have a vocabulary to work with.

 

Separator: demarcates two fields - so it is a comma ',' for CSVs.

 

Delimiter: it signifies the limits of things, where they begin and end. E.g. "Test String" has two delimiters, both double quote characters. Many CSVs use double quotes as delimiters when values containing commas are to be placed as text.

 

Terminator : Indicates end of sequence. For CSV, we can think of newline as the terminator.

 

So if we are unsure whether commas or double quotes are allowed inside data, we can look at the CSV specification:

 

The de facto standard for CSV is here in case you want to read the full standard.

 

http://tools.ietf.org/html/rfc4180

 

Definition of the CSV Format

 

  1.  Each record is located on a separate line, delimited by a line break.

 

  2. The last record in the file may or may not have an ending line break.

 

  3.  There may be an optional header line appearing as the first line of the file with the same format as normal record lines.

 

  4.  Within the header and each record, there may be one or more fields, separated by commas.  Each line should contain the same

      number of fields throughout the file.  Spaces are considered part of a field and should not be ignored.  The last field in the

      record must not be followed by a comma.

  

  5.  Each field may or may not be enclosed in double quotes (however some programs, such as Microsoft Excel, do not use double quotes

      at all).  If fields are not enclosed with double quotes, then double quotes may not appear inside the fields.

  

  6.  Fields containing line breaks (CRLF), double quotes, and commas should be enclosed in double-quotes.

 

  7. If double-quotes are used to enclose fields, then a double-quote appearing inside a field must be escaped by preceding it with

       another double quote.

 

In my experience, point 7 is where we get tripped up the most. CSV stands for comma-separated values, leading to the impression that commas are the only special character; and given that Excel doesn't put double quotes around everything, it can start to get confusing.


 

So looking at some examples

 

Basic Example:

 

10, Vikas , Sydney

 

Data with separator / delimiter inside them.

 

"11", "Vikas", "Sydney, AU"      <-- Data containing comma

 

"12", "Vikas", "Sydney, ""NSW"" AU"   <-- Data containing a comma and quotes; per rule 7, the embedded quotes are escaped by doubling them

 

Handling in ABAP:

 

I'm focusing on reading files, as that is where we face issues. The file can be uploaded from the user's desktop or read from the application server.

 

1)  Write your own code:

 

This can be the easiest to start with, but can get complicated over time.

 

Get the data as a string and split it at the commas.

 

   split lv_data at ',' into lw_struct-test1 lw_struct-test2 lw_struct-test3.

 

Drawbacks:

 

a) This won't work if the data contains the separator, terminator or delimiter (so no commas, double quotes or newlines within data).

 

b) The code will need to be updated whenever the file format changes - say we need to add another field test4. The code then changes to:

 

   split lv_data at ',' into lw_struct-test1 lw_struct-test2 lw_struct-test3 lw_struct-test4.

 

 

2) Read the file using KCD_CSV_FILE_TO_INTERN_CONVERT

 

CALL FUNCTION 'KCD_CSV_FILE_TO_INTERN_CONVERT'

  EXPORTING

    i_filename      = 'C:\Temp\Upload.csv'

    i_separator     = ','

  TABLES

    e_intern        = gt_intern

  EXCEPTIONS

    upload_csv      = 1

    upload_filetype = 2.

 

Drawbacks

 

a) The file can be read only from presentation server/ desktop.

 

b) If a CSV file exists with double quotes, the last field is left with double quotes.

 

c) In case the file is to be read from application server, we need to read the code inside this FM and write some custom logic.

 

3) Use RTTI and dynamic programming along with FM RSDS_CONVERT_CSV .


It works, but requires a lot of code. You can have a look at the code in this gist.

 

CSV_Upload_long_process

 

In summary the steps are :

 

- Get the structure of the destination table using RTTI

- Create field catalog

- Create a dynamic table for field catalog values

- Create dynamic table lines

- Process Raw CSV data

- Store CSV files into dynamic table

 

Drawback:

 

a) Relatively long code, especially if you have to program it from scratch.

 

Advantage:

 

a) The code is independent of the target table format. If a new field is to be added, just update the structure for table type z_data_tty.

 

4) Use class CL_RSDA_CSV_CONVERTER .

 

So the call becomes very straightforward - instantiate the class with the separator and delimiter values. For a normal CSV, leave them as default.

 

* Instantiate the CSV object
  CALL METHOD cl_rsda_csv_converter=>create
*   EXPORTING
*     i_delimiter = c_default_delimiter
*     i_separator = c_default_separator
    RECEIVING
      r_r_conv = lo_csv.

* Process records
  LOOP AT lt_upload_data INTO lv_data.

    CALL METHOD lo_csv->csv_to_structure
      EXPORTING
        i_data   = lv_data
      IMPORTING
        e_s_data = lw_struct.

*   ... work with lw_struct here ...

  ENDLOOP.

 

 

That's It !

 

Advantages:

 

a) The code is very small - fewer chances of us making an error compared to option 3) above.

 

b) The code is decoupled from the file structure - we keep the benefit from the point above.

 

c) It can be used for both application server / presentation server files - of course file reading will need to be done before the call.

 

d) The developer has documented the examples exhaustively in method CSV_TO_STRUCTURE. A big thank-you to him/her!

 

e) It's part of package RSDA which is present in ABAP trial environments as well such as NSP .

 

 

If you don't feel like typing the full program, here is the whole source code:

 

CSV_Reading_option_4

Back to basics - Screen Flow and Screen Sequences .


I was a little hesitant about whether there is a place for such a post, but then I decided to let the public be the judge....


Sometimes we have to create a program that involves multiple screens.

 

Program Y_R_EITAN_TEST_08_21 (attached) is a sample program for demonstration purposes.

 

The program navigation is done using the statements "SET SCREEN dynnr", "LEAVE TO SCREEN dynnr" and "LEAVE TO SCREEN 0".
(There are other options; please refer to the online help.)

 

The program starts with a "selection screen", and then screen 0100 is called using "CALL SCREEN 100".

 

The program uses two levels of "screen sequence":
- 0100, 0200, 0300, 0400
- 1100, 1200, 1300, 1400


Screens 0100,0200,0300,0400 navigation is done using "SET SCREEN dynnr" and "LEAVE TO SCREEN dynnr" .


Screen 0100 will use "LEAVE TO SCREEN 0" to terminate the "screen sequence" and return to the "selection screen" .

 

From screens 0100, 0200, 0300, 0400 we can (using a "function code") "go down" to screen 1100, so "CALL SCREEN 1100" is used
(a new "screen sequence" is started).

 

Screens 1100,1200,1300,1400 navigation is done using "SET SCREEN dynnr" and "LEAVE TO SCREEN dynnr" .

 

From screens 1100, 1200, 1300, 1400 we can "go back" to the caller of this "screen sequence", so "LEAVE TO SCREEN 0" is used.
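The navigation described above boils down to a PAI module along these lines (a sketch; the function codes are assumptions, LWR_LEVEL matches the status shown further down):

```abap
MODULE user_command_0100 INPUT.
  CASE sy-ucomm.
    WHEN 'NEXT'.
      LEAVE TO SCREEN 0200.   " stay inside the current screen sequence
    WHEN 'LWR_LEVEL'.
      CALL SCREEN 1100.       " start a nested screen sequence
    WHEN 'BACK'.
      LEAVE TO SCREEN 0.      " terminate the sequence, return to the caller
  ENDCASE.
ENDMODULE.
```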

 

Screens:

 

screenshot_01.png

 

screenshot_02.png

 

screenshot_03.png

 

screenshot_04.png

 

 

screenshot_05.png

screenshot_06.png

Happy navigation .

 

Due to the site limitation:

- Screens 0200,0300,0400 are a copy of screen 0100
- Screens 1200,1300,1400 are a copy of screen 1100

 

The user interface:

Title text: TITLE_COMMON&1
Status: STATUS_COMMON

Function Code | Function Type | Function Text       | Icon Name
BACK          | E             | Back                |
CANCEL        | E             | Cancel              |
EXIT          | E             | Exit                |
LWR_LEVEL     | E             | Lower level screens | ICON_NEXT_HIERARCHY_LEVEL

Search Hard Codes In Smart Form And Smart Styles Used In The Form


1.PNG

 

The scanning program scans the whole smart form and displays the hard-coded values used in the windows and in the initialization; it also shows you how to find the exact node where a value is placed once you know the name of the node.


2.PNG


  • Reduces manual search through every window, which saves a lot of time when migrating an old sales organization to a new sales organization.
  • Displays the styles used in the smart form.
  • Displays the sales organization, or any other hard-coded value, used in window conditions or code lines.
  • Displays the exact node, window and page of the search string in ALV format.


3.PNG

  • You need to pass the downloaded XML file in the selection screen; the standard program RFRECPSFTLXML can be used to convert the smart form into XML format.

4.PNG

 

  • It uses the class CL_XML_DOCUMENT to read the inputs provided in the selection screen, and parses the DOM into an XML stream (string) using the method RENDER_2_XSTRING.
  • Using the function module SMUM_XML_PARSE, it converts the XML file data into an SAP internal table.
  • Form SCAN_CODE is used to find the hard-coded values in the global data declarations, initialization and form routines.
  • After that we get the details of the pages, windows and nodes used in the smart form.
  • Table ZSF_HARDCODE is used to maintain the entries to search for: the old sales organization, or any other string that needs to be searched.
  • It also displays the styles used in the particular smart form.
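The parsing step can be sketched as follows (a minimal sketch; lv_xml is assumed to already hold the rendered form XML, e.g. from RENDER_2_XSTRING):

```abap
DATA: lt_xml_tab TYPE STANDARD TABLE OF smum_xmltb,
      lt_return  TYPE STANDARD TABLE OF bapiret2,
      lv_xml     TYPE xstring.

* Convert the XML stream into a flat node table
CALL FUNCTION 'SMUM_XML_PARSE'
  EXPORTING
    xml_input = lv_xml
  TABLES
    xml_table = lt_xml_tab
    return    = lt_return.

* Each row of lt_xml_tab represents one node (hierarchy level, tag
* name, value) and can be scanned for the hard-coded search strings.
```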


How to Find the Exact Node of the Smart Form:

 

Please refer the below link,

Trick to Find Any Node in Smartforms - ABAP Help Blog

 

 

Note/Need Help:

I created this program within a week, for temporary use, to avoid manually checking so many forms. If you have any suggestions to improve the performance, or any other feedback, it is always welcome.

One main disadvantage I am facing: if the same window name is used on the next page, the name of the window does not appear in the internal table; the system itself assigns some unique window name in its internal table.

I am also not able to find how many standard include texts (SO10) are used in the smart forms.

 

8.PNG

Last But Not Least:

 

1.PNG

 

2.PNG

In this table we maintain the required sales organization text.

3.PNG

 

The code has been attached in the document for your reference; please find the sample screenshots below. This article was created because of a doubt I asked about in SDN (http://scn.sap.com/thread/3614818).

Special thanks to Eitan Rosenberg, who shared the following threads and made me think about the possibilities. Thanks to every SDN member for the support.

1.    http://scn.sap.com/community/abap/blog/2014/09/07/basic-smart-form-source-scan

2.    http://scn.sap.com/docs/DOC-55861.


I am not able to attach more than three files, so I have put the TOP include and the selection screen in the same file; please copy them out and create separate includes. Thanks for understanding.




How to submit a report from a V2 update process?


Hello mates,

 

I believe this post will seem like a really strange solution to most of the gurus here, but it was the only solution I found to solve the task.

 

Sometimes you have a set of programs (standard, or custom ones written many years ago) that work well and don't need to be redesigned to the new class-based approach. And your logic has the requirement to run these reports from update tasks.

 

Why do I have such a strange requirement? Let's say we have a report that regenerates something, and I have an update process that should finally include this regeneration as well. The call should be done only after the DB commit, and of course it shouldn't happen in case of errors. And yes, it takes a lot of time; that's why we call the update FM in V2.

 

So initially we are here:

 

InUpdateTask.png                Submit_from_rfc.png

 

So if you try to submit the report directly from the SET_STATUS_FM that has been run in an update task, you will definitely get a short dump. Some of my teammates were even sure there was no way to do this without getting the dump, but we finally handled it. Calling the report via a job is also not possible here.

 

After spending some time researching, I figured out that it is possible to submit a report from a transactional RFC. So I prepared a test program to prove that. It looked like this:

 

Submit_from_rfc2.png
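For illustration, such an RFC-enabled wrapper might look like the sketch below. The names ZUPDATE_INDEX_RFC and ZREGENERATE_INDEX are made up for this example; the real names depend on your system.

```abap
FUNCTION zupdate_index_rfc.
*"--------------------------------------------------------------------
*" Remote-enabled function module (processing type: RFC)
*"--------------------------------------------------------------------
  " The tRFC/bgRFC unit runs in its own work process, outside the
  " update task, so SUBMIT is allowed here and causes no short dump.
  SUBMIT zregenerate_index AND RETURN.
ENDFUNCTION.
```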

 

The next step was supposed to be simply calling this UPDATE_INDEX_RFC from SET_STATUS_FM, but calling tRFC from update processes is prohibited, and you will also get a dump here. But a solution was found, and its name is bgRFC.

 

Frankly speaking, for a long time I never considered this technique provided by SAP, because I always thought it was the same as qRFC and tRFC, just designed in a class-based way. But that is not the case. There is at least one really powerful feature that bgRFC provides: you can call it from update processes.

 

I will also skip the description of the bgRFC functionality. All the related information can be found here:

bgRFC (Background Remote Function Call) - Components of SAP Communication Technology - SAP Library

 

So I configured bgRFC according to the provided help. I also created a new inbound destination. Then I changed the test program with code like this:

 

TRY.
    DATA(lo_dest) = cl_bgrfc_destination_inbound=>create( 'ZUPDATE_INDEX' ).
    DATA(lo_unit) = lo_dest->create_qrfc_unit( ).
    lo_unit->add_queue_name_inbound(
      EXPORTING
        queue_name = 'ZUPDATE_INDEX' ).
    CALL FUNCTION 'ZUPDATE_INDEX_BGRFC' IN BACKGROUND UNIT lo_unit.
  CATCH cx_bgrfc_invalid_destination.
ENDTRY.

And the sequence of calls became like this:

Submit_from_rfc3.png

 

That worked well also and the last step was to combine all the calls in one chain:

 

Submit_from_rfc4.png

 

You are probably thinking now that I'm crazy and looking at this as a piece of trash. But if so, I would be glad to hear an alternative way to do this =).

 

Anyway, we achieved several goals at the same time:

  • we reused the report as is; no code refactoring inside
  • if the update process fails, the bgRFC does not start
  • the bgRFC starts only after the COMMIT statement
  • the bgRFC uses the already updated database

 

There is at least one more point I would pay attention to:

  • If the bgRFC crashes, the update is not rolled back.

 

But the last point was OK for us, so we didn't worry about it. It's similar to a report scheduled to run as a job.

 

I hope this information could be helpful for some of you.

 

Best regards,

 

Petr Plenkov.

Data Pre-Validation TOOL – Envision Perfect SAP Master Data Validation


Let me ask one question before we touch on the Data Pre-Validation tool: what is a core business principle for running a successful company?


The answer is “Assets must be managed meritoriously”. However, there is an often-overlooked asset, particularly with enterprise-wide business management systems such as SAP ERP. This asset is “Core Data”, which includes master data and transactional data.


Managing large amounts of data can be a significant challenge for most organizations. Some data management tasks include:



 

Screensht1.jpg


Above are some common data management tasks. Among these, data validation is the most important, and this is where the “Data Pre-Validation TOOL” comes into the picture.


  • Data Migration : Loading legacy system data to SAP applications
  • Data Validation: Validate and ensure clean data is getting loaded to SAP system
  • Data Maintenance: Mass load of data to SAP
  • Data Creation: Creating new master data (i.e. Customer/Vendor/Materials, etc.) or Transactional Data (Sales Document/Invoice, etc.)


Every data loading project should have a plan that includes quality and user acceptance as the top priorities. Preparation is the key to the success of any operation, and data loading is no exception. So before loading data into SAP systems, the data must be validated correctly, and we are introducing such an Excel-based data validation tool here.



When business users make mass changes to data as part of SAP data maintenance, they often first extract the data into a more user-friendly format, such as an Excel spreadsheet, make the changes, and then upload the data back into SAP.


We are going to see how we can implement excel based Data Validation tool.


Pre-requisites to implement the tool: Microsoft Excel VBA coding / ABAP coding knowledge


Introduction: This blog will help you understand and implement the Data Pre-Validation tool for SAP master data. As we are aware of the importance of clean and correct data, this tool supports data validation with a user-friendly approach.

 

 

What we will cover in this blog?


  1.   What is Data Pre Validation TOOL?
  2.   Major Pain-point in day to day master data management
  3.   Prerequisites for Implementing the TOOL 
  4.   Concept Behind the TOOL
  5.   DEMO
  6.   Benefits of the TOOL



Let’s begin!!!


  1. What is Data Pre-Validation TOOL?


  • Data validation ensures clean, correct and useful data. However, user interface complexities, inflexible business processes, ineffective mass maintenance of master data and poor governance can severely hinder the process, leading to dirty data and large financial losses. This validation tool makes master data validation easier.
  • Automation Tool:   This simplifies Master Data Validation process.
  • SAP Data Management: Data pre-validation tool is designed to make SAP Data Management easier using everyday tool.
  • Well Suited for Non-Technical workforce: Easy user interface
  • Increases productivity and Reduces Efforts: Simplified tool will increase productivity.
  • Less ABAP code/Zero ABAP code.



   2. Major Pain-point in day to day master data management

 

    Below are the major pain points in SAP Master Data Management.


Screensht2.jpg

 

These are major pain points in Master Data Management in SAP which can be overcome via Data Validation Tool.


Let us see prerequisites for implementing this tool:


  3. Prerequisites for Implementing the TOOL: The prerequisites to implement this tool are:

  • Microsoft Excel Version 2010 and above
  • Excel VBA coding and SAP ABAP coding knowledge

 

We have seen the prerequisites to implement the tool; let us now see the concept behind the tool and the implementation steps:



  4. Concept behind the TOOL: The picture below depicts the connectivity between MS Excel and SAP.


Screensht3.jpg

Example with implementation steps:

 

Implementation divided into 2 parts (Excel & SAP side)

 

  • Excel File steps
  • Create excel file for user input.
  • Create user interface in excel sheet to input data for the vendor master data and output.
  • Write VB code to make connection with SAP.
  • Read the data from excel.
  • Validate data with SAP table.
  • Display results in excel sheet itself.

 

  • SAP side steps
  • Create function module in ABAP and keep it remote enabled i.e. RFC.
  • Add import, export and tables parameters as per requirement from MS Excel.
  • Write logic in RFC to validate the data.
  • Return validation results.


 

  • MS Excel – User Interface
  • Design excel as per below screenshot:


Input -> Sheet 1

 

Screensht4.jpg

 

Output -> Sheet2

 


  Screensht5.jpg

 

  • Performing DATA validation:

 

  • We can add two types of validations:
  • MACRO Based: Write Macro for validations such as ’Mandatory field value', ‘Enter value in capital letters only’, ‘Length of the field’, ’Enter Only Characters, ’Enter Only Numeric value’.


  • Such validations can be handled via Excel Macro without SAP connection.

       

         Let us see the example of a MACRO below: write the code on the Worksheet_Change event.

         The validation below checks whether a particular column that is a mandatory field is empty; if so, it highlights the column in red and adds a comment.


Screensht6.jpg

 

 

  • SAP data validations: SAP-side validations come into the picture when we want to validate data against SAP tables. Example: validate the country (here we want to check whether the user has entered a correct country code) against the T005 table.
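A minimal sketch of such a remote-enabled check, assuming a hypothetical function module name ZVALIDATE_COUNTRY, could look like this:

```abap
FUNCTION zvalidate_country.
*"  IMPORTING
*"     VALUE(IV_LAND1) TYPE LAND1
*"  EXPORTING
*"     VALUE(EV_VALID) TYPE ABAP_BOOL
*"     VALUE(EV_MESSAGE) TYPE STRING
*"--------------------------------------------------------------------
  " Remote-enabled FM: check whether the country code exists in T005.
  SELECT SINGLE land1 FROM t005 INTO @DATA(lv_land1)
    WHERE land1 = @iv_land1.
  IF sy-subrc = 0.
    ev_valid = abap_true.
  ELSE.
    ev_valid   = abap_false.
    ev_message = |Country code { iv_land1 } does not exist in T005|.
  ENDIF.
ENDFUNCTION.
```

The Excel VBA side would call this FM over the SAP connection and color the cell based on EV_VALID.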

       


 

   5. DEMO:


Please see below demo how the data can be validated:

 

Example 1: These are MACRO based validations:


  1. The STRAS (Street) field is mandatory: whenever the field is left blank, the macro validation highlights the field in red and adds the comment “This is Mandatory field. Please fill in value.”

Screensht7.jpg


    2. The NAME1 (Name) field should contain capital letters and its length should not exceed 40 characters: whenever a value is entered in lowercase letters or digits, or is longer than 40 characters, the macro validation highlights the field in red and adds the comment “Max 40 Chars Caps only.”

 

Screensht8.jpg

Example 2: These are SAP based validations:

 

  1. SAP validations are those implemented in the RFC (function module) code; the results are returned to Excel.

        Below is a small demo of how the results are returned from SAP to Excel. For these validations the “RFC_READ_TABLE” FM can be used.

       

        So without any coding in ABAP these validations can be performed.

Screensht10.jpg



   6. Benefits of the TOOL: Below are the benefits of the tool which are self-explanatory.


Screensht11.jpg


This way we can use an Excel-to-SAP interface for data validation. This technique is beneficial when the data volume is huge. Furthermore, we can also design dynamic rules to perform validations.

 

 

I hope this blog gave helpful insight on SAP data validation concept.

ABAP and Line Feeds


What is a line feed?

 

A line feed is a control character as defined in Wikipedia. In former times such control characters were used to mechanically feed a line in a teletype writer or to let the carriage literally return to the left of the paper. Nowadays, such control characters are mainly used to position cursors in outputs. Many output programs interpret the most important control characters for line feed, return, and tabulator according to their meaning. In source code, HTML or XML files, such control characters are handled as whitespace characters. They are not interpreted as part of the file but can help to make it more readable.

 

How to get a line feed character in ABAP?

 

If you search the Web for how to get a line feed character or another control character in ABAP, you mainly find the class CL_ABAP_CHAR_UTILITIES. And in fact, you can use its attribute CL_ABAP_CHAR_UTILITIES=>NEWLINE for that. But why so complicated? Since Release 7.02, string templates, which are enclosed in |-delimiters, allow you to denote a line feed directly.

 

|....\n...|  for a line feed among other contents or |\n| for a single line feed character.

 

Other control characters supported by string templates are \r and \t. If you only need those, no need to use CL_ABAP_CHAR_UTILITIES. To prove that, run the following lines of code in your system:

 

ASSERT cl_abap_char_utilities=>newline        = |\n|.
ASSERT cl_abap_char_utilities=>horizontal_tab = |\t|.
ASSERT cl_abap_char_utilities=>cr_lf          = |\r\n|.

 

Where to use a line feed character in ABAP

 

First, where not to use it. You cannot use a line feed character or any other control character in classical list programming. The following line


WRITE |aaaa\nbbbbb|.


produces an output like aaaa#bbbbb. The classical list processor does not recognize the control character and does not treat it as whitespace. It is an unknown character that is displayed as #. In classical list programming you have to use two (chained) WRITE statements:

WRITE: |aaaa|, / |bbbbb|.


Now where to use line feeds and other control characters? You use them if you want to send them somewhere where they are understood, e.g. writing to a file or sending to displays other than classical lists:

 

cl_demo_text=>show_string(
  |<html>| &&
  |  <body>| &&
  |    Hello!| &&
  |  </body>| &&
  |</html>| ).

cl_demo_text=>show_string(
  |<html>\n| &&
  |  <body>\n| &&
  |    Hello!\n| &&
  |  </body>\n| &&
  |</html>\n| ).

 

The first output gives

 

<html>  <body>    Hello!  </body></html>.

 

The second gives

 

<html>

  <body>

    Hello!

  </body>

</html>

 

Last but not least:

 

 

cl_abap_browser=>show_html(
  EXPORTING html_string =
    |<html>\n| &&
    |  <body>\n| &&
    |    Hello!\n| &&
    |  </body>\n| &&
    |</html>\n| ).

 

gives the expected output; the browser ignores the line feeds.

 

That's all ...


You can get plenty of information about this if you can Google......


Hi,


From time to time we see as a response to a question something like:

"You can get plenty of information about this if you can Google.."


I agree with that response, but searching can be frustrating.


But there is a helping hand: introducing "Google Advanced Search" http://www.google.com/advanced_search

 

Let's do a search for "Smart Forms":

 

screenshot_01.png


We can see that this was translated to: "Smart Forms" site:help.sap.com


screenshot_02.png


The search uses "search operators"; a full list of those can be found using: "Search operators" site:google.com


Note the warning.


screenshot_03.png

 

Let's continue with our search for "Smart Forms".


I am interested in tables so I go to "Printing Data in a Table" .

(After opening some nodes manually)


screenshot_05.png

 

We can see that we have a "download" option; let's use it.


The result will be a PDF file that we can read at leisure....


Here is the table of contents of the generated PDF:


screenshot_04.png

 

Happy searching.

 

Regards.

 

P.S. Advanced search is not unique to Google; for example, Bing also supports "Advanced search keywords":
http://onlinehelp.microsoft.com/en-us/bing/ff808421.aspx?sl=1

Some Simple Innovative Ways - PDF Output For ABAP Reports


It has been quite a while since something as interesting as this came up. Old-school ABAP developers like myself, who have found comfort working in ABAP for a decade, become so ingrained in it that we fail to think outside the box and find solace in what we know. Even treading into OO ABAP is an uneasy feeling and best avoided, but as time ticks on we try to keep pace with the changing technology lest we are left miles behind.

 

Sometimes we need to come out of our shell and explore the new possibilities of combining technologies to reach a solution. Recent times have felt like a storm passing overhead, and I am lucky to still be afloat; with clients demanding more bang for the buck and being difficult to appease, we need to find some innovative ways to stay in the game.

 

Of late there was a bolt-from-the-blue requirement from the client which sent a chill down the spine, as we were on the verge of completing the build. The requirement from their perspective was simple: they wished, or rather wanted, all the ALV reports to be displayed in PDF. Though initially we cowered a bit, we decided to take it as a challenge, as it hurts the pride of an ABAPer when a comparison is drawn with other legacy software applications which can do such things immaculately. There were also possibilities in SAP, such as spooling the data and converting it to PDF or using a PDF printer, but those solutions had some inherent shortcomings: as a developer you have no control over the form per se, and many finer requirements could not be met using that approach.

 

From my previous experiences with Adobe Forms I knew that such a requirement could not be met by ABAP alone; we needed a multi-faceted approach. Yes, we needed more arrows in the quiver. I was aware that Adobe Forms can be dynamically manipulated using JavaScript or FormCalc, and I had also come across the concept of dynamically rendering subforms using instance manager scripts. Browsing Adobe forums helped a lot in coming up with a solution. I came to know that JavaScript Object Notation (JSON) is a way of representing data as a string object model, and that it can be parsed using JavaScript. Now I needed to find a way to convert SAP data to JSON format. Browsing and searching, we found a class in ABAP which converts an internal table into a JSON string, and that is exactly what we wanted. But we had to clone and modify the class a bit, as the JSON produced by it was not being interpreted by JavaScript correctly.

 

So, as the first step, we cloned the class CL_TREX_JSON_SERIALIZER to ZCL_TREX_JSON_SERIALIZER and made modifications to the RECURSE method to put single quotes around the data. We tested the class, and it produced exactly what we required.

Sample Output String

[{'pernr': '14900001', 'begda': '20140922', 'endda': '20140923'}, {'pernr': '14900002', 'begda': '20140922', 'endda': '20140924'}]
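On the ABAP side, producing such a string from an internal table might look like the following sketch. We assume the cloned class ZCL_TREX_JSON_SERIALIZER keeps the interface of the original CL_TREX_JSON_SERIALIZER; the table and field names are illustrative:

```abap
DATA lt_data TYPE STANDARD TABLE OF zperiods.  " pernr, begda, endda

SELECT pernr, begda, endda FROM zperiods INTO TABLE @lt_data.

" ZCL_TREX_JSON_SERIALIZER is the clone described above, modified
" to put single quotes around the data values.
DATA(lo_serializer) = NEW zcl_trex_json_serializer( lt_data ).
lo_serializer->serialize( ).
DATA(lv_json) = lo_serializer->get_data( ).  " the string passed to the form
```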

 

The next step was to design the form which will render dynamically based on the JSON String. The form is a very simple one,

it just has a subform containing a table; the table has one row and a column cell, which is wrapped in a repeatable subform named Column. We also added certain other dress-ups like the header, footers, and the logo.

 

Form Layout

Form.PNG

 

The form has a very simple interface where data is passed as a JSON string; we did incorporate more fields in the interface to help us manipulate the form using JavaScript.

 

Form Interface

 

 

Interface.PNG

Having done this, we needed a small but somewhat complicated piece of JS code in place to parse the JSON string and then automatically build the table using instance managers.

The javascript code is fired in the formready event of DIS subform.

FormJS.PNG

The createTable function does the following:

1. It calculates the cell width based on the page width and the number of columns.

2. It splits the JSON string into an array of JavaScript objects; you can simply use the eval function in JS without the use of any custom library functions.

3. Count the data length, which is basically the number of rows in the internal table.

4. Count the number of columns, which is the number of columns in the internal table; this is the number of attributes of a JS object in the array.

5. Start rendering using instance manager commands.

 

ZigZag Rendering

Since we have just one cell which has to be instantiated as many times as there are cells, we start rendering column by column in a zig-zag fashion as shown below.

The JS code that instantiates the cells consists of dynamically constructed instance manager commands:

 

xfa.resolveNode(t).Column.instanceManager.addInstance(0);

Rendering.PNG

Since the idea of the blog is only to explain the concept, I have not pasted the entire JS code here.

 

Our next major challenge was how to plug this PDF into all the reports. Fortunately for us, we had developed a reusable ALV display function module to minimize the coding effort for developers on simple ALVs, and this function was being used in most of the reports. We enhanced the function module's GUI status and created a PDF function which responds by passing the data to the form and displaying it. So once we had enhanced this function, all the reports got the PDF function with no extra effort; interestingly, this idea was conceptualized and executed in less than a week.

 

Report and PDF display.

Report.PNG

Clicking on Generate PDF

Formdata.PNG

 

Some of the challenges we faced after initially implementing the concept,

1. The heading on each page was not getting displayed as it was dynamically rendered; this we overcame by having separate logic for the header on the master page.

2. Modifying header labels - we did a replacement of generated headers from the data

3. The next major challenge was to display only filtered rows or columns in the generated PDF: using ALV-related functions we were able to trace which columns and rows we needed to display, and based on that we built the internal table dynamically before converting it to JSON and transferring it to the PDF form.

4. Font type and size control - we were able to dynamically control font size and type using javascript based on the number of columns to display.

 

We still have challenges on the performance front when the data is huge, and also on customizing due to its generic behavior, but at the end of the day I believe we were able to conquer some unexplored frontiers; we analysed and joined the dots to come up with a decent solution. Thanks for reading this blog, and apologies for not being technically very detailed.

 

My sincere thanks to fellow developer Rajiv Kumar, whose steadfastness made this happen.

 

Regards,

Raghav

ABAP Dictionary - Extra Dry


Every ABAP Programmer knows the ABAP Dictionary, the metadata repository of the AS ABAP, where you can define semantically rich data types and database tables or views as the backbone of any serious ABAP application. Besides the ABAP Dictionary itself, there is the tool ABAP Dictionary, good old SE11, that is still not replaced by ADT in Eclipse.

 

When you are working with the ABAP Dictionary, you can use the F1 help in SE11 and the SAP Library Documentation. The F1 help is, as a rule, directly connected to an input field or function of the tool, while the SAP Library Documentation is more task-oriented. What was missing up to now was a comprehensive reference documentation for the ABAP Dictionary. Well, here it is:

 

ABAP Dictionary

 

This documentation drills down all the facts for data types, database tables, and views that were distributed over the F1 help and the SAP Library Documentation, updates them where necessary, and presents them in a totally new structure - and this very, very dry (no direct speech, shame, shame!). The documentation is divided into three parts:

 

 

It is part of the ABAP Keyword Documentation. This means links between language topics and dictionary topics are finally possible. All dictionary objects that are important for the ABAP language are described in detail and independently of the tools, making a distinction between their technical and semantic properties. The F1 help of SE11 in Release 7.40, SP08 was adjusted so that you can choose between branching to the SAP Library Documentation or directly to the respective reference documentation.

 

Your feedback is welcome.

#sitHH : Exposing data to #UI5 without using SAP Gateway, Part II: back to standard


You may (should) have read part I of this blog post already to understand the motivation why I haven't used SAP Gateway for this short example.

In part I you'll also find some information / coding I'm not going to repeat here.

 

The Motivation

 

The other day one of my customers upgraded his system (finally) to the most recent enhancement pack and I thought to myself "What A Wonderful World" no no no, "could I write my example just with the standard tools now?". I've played around with my own NW 7.40 system and with just a little adjustment of the UI5 application it worked.

 

Just today Renald Wittwer commented on part I of the blog post that he used my example in one of his projects. Nice :-) But this was also the reason for this part II. If you are already on a current NetWeaver release (I think this should already work with NW 7.31), you can (or better, should) use the SAP standard.

 

The Solution

 

Instead of using ADL by DJ Adams we use the really nice REST framework around the classes cl_rest_http_handler and cl_rest_resource.

Instead of using my JSON document class we use the standard CALL TRANSFORMATION

 

The Data

 

No changes to the original example.

 

The JSON document


DATA salesorders TYPE STANDARD TABLE OF ysithhsalesorder.

 

SELECT * FROM ysithhsalesorder

  INTO TABLE @salesorders.

 

DATA(lo_json_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).

CALL TRANSFORMATION id SOURCE itab = salesorders RESULT XML lo_json_writer.

cl_demo_output=>display_json( lo_json_writer->get_output( ) ).


Result


p1.PNG


As you can see, the only difference is that the JSON field names (labels) are all in upper case now.

 

The Call

 

The standard REST framework is built quite similarly to DJ Adams' ADL (dispatcher/handler -> resource) and is self-explanatory.

 

The Handler


CLASS ysithh_rest_test DEFINITION

  PUBLIC

  FINAL

  INHERITING FROM cl_rest_http_handler

  CREATE PUBLIC.

 

  PUBLIC SECTION.

    METHODS if_rest_application~get_root_handler REDEFINITION.

  PROTECTED SECTION.

  PRIVATE SECTION.

ENDCLASS.

 

CLASS ysithh_rest_test IMPLEMENTATION.

 

  METHOD if_rest_application~get_root_handler.

 

    DATA(lo_router) = NEW cl_rest_router().

    lo_router->attach( iv_template = '/orders' iv_handler_class = 'YSITHH_REST_SALESORDERS_TEST' ).

 

    ro_root_handler = lo_router.

  ENDMETHOD.

 

ENDCLASS.


And again: don't forget to enter the dispatcher class into the ICF path via transaction SICF

p2.PNG

p3.PNG


The Resource

 

Not needed but interesting: in my resource, this time I'm also handling the so-called "content negotiation", meaning that I'm asking what type of content the client wants to get as a response.

 

CLASS ysithh_rest_salesorders_test DEFINITION

  PUBLIC

  INHERITING FROM cl_rest_resource

  FINAL

  CREATE PUBLIC.

 

  PUBLIC SECTION.

    METHODS if_rest_resource~get REDEFINITION.

  PROTECTED SECTION.

  PRIVATE SECTION.

ENDCLASS.

 

CLASS ysithh_rest_salesorders_test IMPLEMENTATION.

 

  METHOD if_rest_resource~get.

 

    DATA(lo_entity) = mo_response->create_entity().

 

    DATA:

      lt_supp_cont_type TYPE string_table,

      lv_content_type   TYPE string,

      lv_accept         TYPE string.

 

* create the list of the content types supported by this REST server

    lt_supp_cont_type = VALUE #(

      ( if_rest_media_type=>gc_appl_json )

      ( if_rest_media_type=>gc_appl_xml )

      ( if_rest_media_type=>gc_text_plain )

      ( if_rest_media_type=>gc_appl_atom_xml_feed )

    ).

 

* get accept value from REST client

    lv_accept = mo_request->get_header_field( if_http_header_fields=>accept ).

 

* find the best "matching" content type from the supported list using the client's input

    TRY.

        lv_content_type = cl_rest_http_utils=>negotiate_content_type(

          iv_header_accept         = lv_accept

          it_supported_content_type = lt_supp_cont_type ).

        IF lv_content_type IS INITIAL.  " no supported format found -> http 406

          mo_response->set_status( cl_rest_status_code=>gc_client_error_not_acceptable ).

          mo_response->set_reason( if_http_status=>reason_406 ).

          RETURN.

        ENDIF.

      CATCH cx_rest_parser_error.

        mo_response->set_status( cl_rest_status_code=>gc_server_error_internal ).

        mo_response->set_reason( if_http_status=>reason_500 ).

        RETURN.

    ENDTRY.

 

    mo_response->set_header_field(

      EXPORTING

        iv_name = 'Content-Type'    " Header Name

        iv_value = lv_content_type    " Header Value

    ).

 

    DATA salesorder TYPE STANDARD TABLE OF ysithhsalesorder.

 

    SELECT * FROM ysithhsalesorder

      INTO TABLE @salesorder.

 

    CASE lv_content_type.

      WHEN if_rest_media_type=>gc_appl_json.

        " Transform data to JSON

        DATA(lo_json_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).

        CALL TRANSFORMATION id SOURCE itab = salesorder RESULT XML lo_json_writer.

        lo_entity->set_content_type( if_rest_media_type=>gc_appl_json ).

        lo_entity->set_binary_data( lo_json_writer->get_output()).

 

      WHEN if_rest_media_type=>gc_appl_xml.

        " Transform data to XML

        CALL TRANSFORMATION id SOURCE itab = salesorder RESULT XML DATA(lv_xml).

        lo_entity->set_content_type( if_rest_media_type=>gc_appl_xml ).

        lo_entity->set_binary_data( lv_xml ).

 

      WHEN if_rest_media_type=>gc_appl_atom_xml_feed.

        " Transform data to Atom

        DATA: ls_feed  TYPE if_atom_types=>feed_s,

              ls_entry TYPE if_atom_types=>entry_s.

        ls_feed-id-uri = 'http://www.sap.com'.

        GET TIME STAMP FIELD ls_feed-updated-datetime.

        LOOP AT salesorder ASSIGNING FIELD-SYMBOL(<f>).

          ls_entry-title-text = | { <f>-id }-{ <f>-company_short }|.

          CONVERT DATE sy-datlo

            INTO TIME STAMP ls_entry-updated-datetime TIME ZONE 'UTC'.

          ls_entry-title-type = if_atom_types=>gc_content_text.

          APPEND ls_entry TO ls_feed-entries.

        ENDLOOP.

        DATA(lo_provider) = NEW cl_atom_feed_prov().

        lo_provider->set_feed( ls_feed ).

        lo_provider->write_to( lo_entity ).

 

      WHEN if_rest_media_type=>gc_text_plain.

        lo_entity->set_string_data( 'Content type: ' && lv_content_type ).

    ENDCASE.

 

    mo_response->set_status( cl_rest_status_code=>gc_success_ok ).

    mo_response->set_header_field(

      EXPORTING

        iv_name = 'Access-Control-Allow-Origin'    " Name of the header field

        iv_value = '*'    " HTTP header field value

    ).

 

  ENDMETHOD.

 

ENDCLASS.

 

The Test

 

In this case we have to use a REST client like the Chrome plug-in "Postman", because we have to send the "Accept" header field for the "content negotiation".

No surprises here; the result is like in part I except for the upper-case labels.


p4.PNG


The App

 

The UI5 application is the same as in part I. The only adjustments you have to make are the new URL in the controller.js and the labels in main.view.xml. The labels must all be in upper case now.

 

p5.PNG

 

The Result

 

p6.PNG

Yea, no differences

 

Epilogue

 

The JSON document class is not dead yet. If you need special handling of the JSON input/output, like date format, lower case, table appends etc. (more features can be found in the wiki on GitHub), you still can and should use the class. Also, the class is still under "standard maintenance".

 

Appendix

 

 

You can find me on Twitter and G+

TechEd && d-code : First Timer


DCODE.png

I will be attending TechEd && d-code this October, in Las Vegas, for the first time in my SAP career.  I am really looking forward to the opportunity to rub elbows with like-minded SAP developers and other SAP professionals.

 

For several years I have been trawling SCN for solutions to one problem or another.  I can safely say that with SCN, a handful of classes, and the seat of my pants, I have learned to become an ABAP developer.  Most of the time, I was working on what felt like an island, as I did not have any ABAP colleagues close at hand.  That has changed significantly within the past year, and now I have the opportunity to attend this conference, of which I have read a great deal and watched presentations from prior years.  I am looking forward to taking another step forward in my professional career and taking in everything the conference has to offer.

 

My Plan...

The highlights are:

Session   Title
DEV262    Evolution of the ABAP Programming Language
TEC102    A Comprehensive Introduction to SAP HANA
DEV264    Custom ABAP Code – Get Ready for SAP HANA!
TEC201    Time to Rethink Your Code Patterns – New Boundary Conditions in Application
DEV212    How to Architect ABAP Applications for SAP HANA
DEV165    Code Better with ABAP in Eclipse
TEC202    Overview and Road Map of SAP NetWeaver 7.4
SEC105    SAP Runs SAP – How to Hack 95% of all SAP ABAP Systems and How to Protect

 

The hardest part is trying to put together an agenda that captures everything you want to learn about.  I understand now why others have said you need to clone yourself.  As you can see from above, my interests lie deeply in the development world, but I'll be sprinkling in some other sessions pertaining to data management and user interfaces.  As has been suggested, I will probably add in a few overlapping sessions as alternatives.

 

Looking Forward To...

Learning about using ABAP in Eclipse, which I have read so much about here on SCN.  I have not yet had the opportunity to use it first-hand.  Plus, there seem to be so many new enhancements to the ABAP language with the most recent release; I need to learn more about them.

 

HANA is another area of interest.  My company has recently implemented it for BW/BI purposes but not yet for other development.  So my understanding of the platform is limited.  I want to walk away with a much deeper understanding of HANA.

 

Also, I have never been to Las Vegas, so I am going to take in a few sights and shows with my wife.

 

Advice to first timers...

I'll be able to answer that better next year!  But to echo some advice I've already seen posted: wear comfortable shoes and keep in mind the time between sessions.  A colleague did tell me to make sure to get my schedule settled as soon as possible, as seats do fill up for the workshops.

 

Thank you and look forward to meeting some of you experts in Vegas!

 

Regards,

Justin Loranger
