Channel: ABAP Development

Please, start thinking international when you develop.


Hi there community,

 

This will be a short blog post, for two reasons. First, I have some time to kill, but not that much! Second, I'm writing this so I have a place to quickly find some information I keep searching for over and over again.

 

This blog post is about something that should be VERY clear to any seasoned developer, but unfortunately I often see it done in a way that is bound to fail. We are talking about currencies, units, and their external/internal formats and representation.

 

 

Writing a value or quantity to the screen or a form

 

When it comes to units, this is not so critical. The most popular option I have seen for this is WRITE variable TO output_field DECIMALS number_of_decimals. So if you're displaying kilos, or square metres, or something like that, most of the time it will work decently. The next thing I see very often is suppression of the decimal places if the unit is pieces, or if the decimal part of the number is zero. So you quickly end up with a lot of "ifs" and added complexity, and as soon as there is a new unit you didn't think about, this fails.

 

When it comes to values (as in, money), this is terrible. I think I have never seen it developed properly in my life. It seems that most people fail to understand that not all currencies were created equal. Most of them have 2 decimals, yes, but some have 3. And some, like my "ex-mother-currency", have none! Most of the time I don't even see a WRITE statement, just a MOVE TO. And then you get no thousands separator, and the decimal separator is always a point, regardless of where you are. There is also usually a fair amount of logic in the form to make up for this, which is completely unnecessary.

 

So what should you do?

 

You should learn more about the WRITE statement. It has the very powerful additions CURRENCY and UNIT.

 

These make use of the proper representation of the unit or currency you are displaying, with no extra work on your side. What more could you ask for?
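For reference, here is a minimal sketch of both additions; the variable names are illustrative:

DATA: lv_amount TYPE wrbtr,      " amount in SAP-internal format
      lv_waers  TYPE waers,      " currency key, e.g. 'EUR', 'JPY'
      lv_menge  TYPE menge_d,    " quantity
      lv_meins  TYPE meins,      " unit of measure
      lv_output TYPE char30.

* Formats the amount with the decimals defined for the currency
* (two for EUR, none for JPY, three for some others) and with the
* thousands/decimal separators from the user's settings.
WRITE lv_amount TO lv_output CURRENCY lv_waers.

* Formats the quantity with the decimals maintained for the unit
* of measure (see transaction CUNI).
WRITE lv_menge TO lv_output UNIT lv_meins.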

 

Also, if for any reason a unit or currency in your system doesn't seem to have the right format, you can use transaction codes CUNI and OY04 to maintain the output format.

 

 

Reading a number from an external format (like an Excel sheet or text file)

 

This is also something that I see terribly implemented almost everywhere. The most popular way of converting a number from an external format (e.g. 1.123,12 in much of Europe) is to use the REPLACE statement abundantly, followed by a CONDENSE... This works fine as long as the code is always executed in the same "locale", but that's not always the case. And if it's not, what do you do? As many IF statements as you can think of? Wrong.

 

So what should you do?

 

You should use the function module MOVE_CHAR_TO_NUM. It reads the user's settings and converts the number accordingly. So, assuming the person is uploading numbers in the same format that person uses in SAP, this will always work.
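A minimal sketch of the call, assuming the user's settings use the comma as decimal separator; the parameter and exception names are from memory, so double-check the signature in SE37 before relying on them:

DATA: lv_char   TYPE char20 VALUE '1.234,56',  " number in the user's display format
      lv_number TYPE p DECIMALS 2.

CALL FUNCTION 'MOVE_CHAR_TO_NUM'
  EXPORTING
    chr             = lv_char
  IMPORTING
    num             = lv_number
  EXCEPTIONS
    convt_no_number = 1
    convt_overflow  = 2
    OTHERS          = 3.
IF sy-subrc <> 0.
* Handle the conversion error, e.g. report the offending input value
ENDIF.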

 

There are situations where this might still not work. For example, if it's a background job executed by a batch user, and the job is supposed to deal with different formats, this becomes a hard task. You will need proper design to deal with that situation.

 

 

Conclusion

 

Now you don't have an excuse any more to use those silly SAPscript additions to suppress zeros, condense, hide the decimals, etc. Just use the WRITE statement with the correct options.

 

When it comes to reading a number from an Excel or CSV file, you are better off using my awesome file reader anyway.

 

If you have any further suggestions or comments let me know, I can update this blog post.

 

Cheers,

Bruno


Vital Educational SAP Conference or Drunken Jolly?


Mastering SAP Technology 2015 in Melbourne

24th May to 27th May


Melbourne Ball of Wool.jpg  Melbourne Ball 2.png

 

We'll be Having a Ball (of wool)

 

Next week in Melbourne is the Mastering SAP Technology conference, kicking off with SAP Inside Track on the Sunday before the conference starts. I am flying out tomorrow lunchtime. This year the conference is co-located with the Mastering HR and Mastering Finance conferences, so it will be three times as big as usual, sort of an Australian mini-SAPPHIRE.

 

Melbourne has a very European nature for an Australian city, with tiny bars hidden in side streets (they don't have signs, they hide themselves, you have to know in advance where they are) and trams including a mobile restaurant tram (just like in Heidelberg in Germany). It is also filled with women pushing enormous balls of wool through the streets.

 

Expensive Junket?


I was asked at lunch this fine day "do you actually learn anything at these things?". It has to be admitted - that is the perception of these events, and indeed of SAPPHIRE itself. They are viewed as a giant party in Las Vegas or Orlando or wherever - at your company's expense - where the delegates just party like there is no tomorrow and occasionally go to a presentation where someone from SAP tries to sell you software you do not want or need.

 

Now there is no smoke without fire - there are huge parties (oh, I mean networking events) and, at the real SAPPHIRE events, pop concerts with Bon Jovi and the like. At the SAPPHIRE in Brisbane in 2003 the lead singer of INXS broke off singing and shouted at the audience "you are all nerds", which proves he at least knew what sort of event he was performing at.

 

And I would be the world's biggest liar/hypocrite if I said I would not be at the drinking (networking) events on Sunday & Monday evenings. Last year Graham Robinson noted that the networking at the conference had been the best he had ever seen to which Martin English shouted out "I network like a fish".

 

There must be more to life, there must be more than this...

 

So, if that was all there was to it - people extorting money out of their employers, going to a party and claiming it was education/training - then eventually the employers would cotton on and such events would be no more. That has not happened - the Mastering SAP Technology event is in its tenth year, and SAPPHIRE has been going a lot longer.

 

Could it be that there is something useful in these things after all?

 

The first point I would make is that at the "Mastering" conference, and indeed to a large extent at "real" SAP conferences, most of the presenters are not SAP or members of the "ecosystem" trying to sell you something, but rather bona fide SAP customers telling you the problems they have had and how they solved them (or, more importantly, failed to solve them) using SAP technology. People love to hear "warts and all" war stories - we all know no technology is perfect, we just laugh at salesmen who tell us it is, that it solves all your problems "out of the box" and takes just four weeks to install and go live "with not one line of code".

 

The Eventful Group run focus groups in each state of Australia (and in New Zealand) where assorted (SAP using) companies say what their pain points are and where they are struggling to use technology to solve those business problems. Then there is a sort of democratic vote as to what the most pressing problems are. And then you get a bottle of wine. You can't keep me away from such things, which is not to say the business challenges of my organisation are not real.

 

Thus a "hot list" of topics is drawn up and a call for speakers is made. Assorted people submit presentations that fit that list and these are voted upon by a peer group of people like me i.e. nothing to do with the company organising the conference. Thus the end result is that all the speakers are talking about topics that have been chosen by a popular vote, and their particular talk has been chosen by another voting process. The communists would be rolling in their graves.

 

The point here is that the resulting presentations are not adverts trying to sell you something at random. They are speeches by real people describing how they solved real world business problems - problems which other real people claim they have.

 

Life's What You Make It


So far so good but here comes the important bit - last year it was stressed again and again that it is so easy to just stay in your comfort zone. You are there with three friends from your company, and you sit next to them at the presentations, and have the sandwiches with them at lunchtime, and drink with them in the evening, and never even talk to anyone else during the whole event.

 

Can you see the problem with that? You are bound to have some sort of problem you are fixing or new enhancement you are making, and unless you are the greatest genius the world has ever known you are bound to be struggling with this to some extent. Somewhere in the room is someone else -  maybe several somebodies - who have the exact same problem as you and they have solved one part of it and you have solved another part of it. And yet you never talk to each other even though they are only five feet away from you.

 

Talk Talk


After each speech there is a "speakers' corner" where the delegates can ask detailed questions about whatever the presentation was just on. Furthermore, I am willing to bet each speaker is willing to talk about the topic both before and after the presentation.

 

And that's just the speakers. The actual other delegates are likely to be mines of information also if you could just bring yourself to have a chat with them.

 

Last year I was nice and hung over as usual and felt the need to sit down on a sort of sofa thing and put my head in my hands. Just then an Indian gentleman recognised my name hanging from my name tag, sat down next to me and started asking questions about HANA. Now I don't claim to know everything - or even that much - about HANA, but it turned out I knew more than he did and I was happy to tell him everything I did know.

 

The same applies this year - if you see me during the drinks or wandering around the booths collecting pens and yo-yos and fluffy toys, ask me about any of the nonsense I wrote about in my book: ABAP 740, BOPF, BRF+, Monsters, whatever, I don't mind.

 

Likewise, that is what the SAP Mentors who are there are for - people like Graham Robinson and what have you. There will be lots of SAP Mentors there and they are bursting to help you.

 

And indeed you may well be able to help the other delegates there - one of the themes from last year was "help yourself, then help others", like what you hear on an airline in regard to oxygen masks. Naturally it is human nature to be self-focussed, so work with that - see if you can't find the answers to your own problems, so you can justify the expense of coming back the next year, but once that is sorted, see if you can't help out other people who have questions you can answer. Those two activities can be concurrent.

 

Right Justified


You do need to justify why you are there if someone is paying for you, and I make a little summary of what was useful out of each presentation I go to.

Last year you were encouraged to write on a Post-it note something like "I can use XYZ tip I just learned at work next week", or even just the bullet points - there are usually only three core messages from each presentation, so write them down if nothing else.

 

That way when your CIO asks you what you got out of it, you can actually present them with a list of information rather than saying "a big bag of toys and a huge hangover".

 

Happy Happy Happy Happy Talk


So to hammer home the point, the key to success here is to get out of the comfort zone and talk to people you do not know. A lot of IT types hate this, as in "if I wanted to talk to other human beings I would not have become a programmer", but give it a go.

 

At the SAP Inside Track there will be what can best be described as SAP speed dating, where you sit at a table with six other people and you all say who you are, then a bell rings and you have to go to a table with six other people, and the process repeats until you have at least a general idea of what everyone in the room does for a living.

 

Wool I Never

 

That was a fairly incoherent stream of consciousness about what I think about SAP conferences, and I apologise if I went round in circles, but hopefully you get the point. In any event I hope next week to talk to a lot of people I do not know, and answer lots of questions, and learn new things myself.

 

Cheersy Cheers

 

Paul

 

P.S.


Oh and I almost forgot!

 

Another vital part of going to any SAP conference is taking pictures of the dustbins/trash cans and posting those pictures on SCN.

 

There is a top secret contest about this each year, it is so top secret you never hear about it anywhere.

 

Last year SAP looked at all the various dustbin photographs throughout SCN blogs and the winner got a thousand billion dollars and a life sized gold plated model of Mount Everest (encrusted with diamonds) and all the fish they could eat.

 

Mind you this is a double edged sword as the loser has to pay a forfeit, which this year was to have to drink ten bottles of whiskey laced with LSD one after the other and then design a new SAP GUI theme and do all the coding for the same, all in ten minutes after finishing the last bottle, which is how we got the Blue Crystal Theme and the initial version of the SAP 740 GUI.

Using Checkpoint Group to Debug ABAP in Background Processing


Debugging ABAP code in background processing is a common requirement. Developers usually approach this task by coding a time delay or an endless loop, taking control of the process in transaction SM50 and debugging it. There is no easy and graceful way to activate or deactivate such a "break-point" in a production environment.

Checkpoint groups provide flexibility and ease of activation and deactivation of break-points using transaction SAAB. For example, a break-point can be activated for a specific user without affecting other users.

SAAB.jpg

If you execute the ABAP program online, execution will stop at the activated break-point.

Debugger1.jpg

The same break-point will not interrupt execution in background processing. There is, however, a way to use the same checkpoint group to interrupt ABAP processing in both dialog and background processing. What it takes is to code an activatable break-point using the custom class method ZCL_AAB=>BREAK_POINT.

Z_DEMO.jpg

 

The ZCL_AAB=>BREAK_POINT method checks whether the activatable break-point exists (calling the ZCL_AAB=>EXISTS method) and is active (calling the ZCL_AAB=>IS_BREAK_POINT_ACTIVE method). If the break-point exists and is active, then in dialog processing execution is interrupted using the BREAK-POINT statement; otherwise, in background processing, execution is delayed for 60 seconds (calling the ZCL_AAB=>TIME_DELAY method).

 

class ZCL_AAB definition
  public
  final
  create public .

*"* public components of class ZCL_AAB
*"* do not include other source files here!!!
public section.

  class-methods BREAK_POINT
    importing
      !IV_AAB_ID type AAB_ID_NAME .

protected section.
*"* protected components of class ZCL_AAB
*"* do not include other source files here!!!

private section.
*"* private components of class ZCL_AAB
*"* do not include other source files here!!!

  class-methods EXISTS
    importing
      !IV_AAB_ID type AAB_ID_NAME
    returning
      value(RT_EXISTS) type CHAR1 .

  class-methods IS_BREAK_POINT_ACTIVE
    importing
      !IV_AAB_ID type AAB_ID_NAME
    returning
      value(RT_BREAK_POINT_IS_ACTIVE) type CHAR1 .

  class-methods TIME_DELAY .
ENDCLASS.



CLASS ZCL_AAB IMPLEMENTATION.


* <SIGNATURE>-----------------------------------------------------+
* | Static Public Method ZCL_AAB=>BREAK_POINT
* +--------------------------------------------------------------+
* | [--->] IV_AAB_ID                      TYPE        AAB_ID_NAME
* +---------------------------------------------------</SIGNATURE>
METHOD break_point.
  DATA: w_text TYPE string.

  IF exists( iv_aab_id ) = space.
    CONCATENATE 'Checkpoint Group' iv_aab_id 'does not exist'
      INTO w_text SEPARATED BY space.
    MESSAGE w_text TYPE 'I'.
    EXIT.
  ENDIF.

  IF is_break_point_active( iv_aab_id ) = 'X'.
    IF cl_gui_alv_grid=>offline( ) IS INITIAL.
*     Foreground
      BREAK-POINT.
    ELSE.
*     Background
      time_delay( ).
    ENDIF.
  ENDIF.

ENDMETHOD.


* <SIGNATURE>---------------------------------------------------+
* | Static Private Method ZCL_AAB=>EXISTS
* +-------------------------------------------------------------+
* | [--->] IV_AAB_ID                      TYPE        AAB_ID_NAME
* | [<-()] RT_EXISTS                      TYPE        CHAR1
* +--------------------------------------------------</SIGNATURE>
METHOD exists.
  DATA: w_aab_id TYPE aab_id_name.

  SELECT SINGLE name
    INTO w_aab_id
    FROM aab_id_prop
    WHERE name = iv_aab_id.
  CASE sy-subrc.
    WHEN 0.
      rt_exists = 'X'.
    WHEN OTHERS.
      CLEAR rt_exists.
  ENDCASE.

ENDMETHOD.


* <SIGNATURE>---------------------------------------------------+
* | Static Private Method ZCL_AAB=>IS_BREAK_POINT_ACTIVE
* +-------------------------------------------------------------+
* | [--->] IV_AAB_ID                      TYPE        AAB_ID_NAME
* | [<-()] RT_BREAK_POINT_IS_ACTIVE       TYPE        CHAR1
* +--------------------------------------------------</SIGNATURE>
METHOD is_break_point_active.
  DATA: wa_aab_id_act TYPE aab_id_act,
        wt_aab_id_act TYPE aab_id_act_tab.
  DATA: w_bit_value   TYPE i.
  FIELD-SYMBOLS <mode_x> TYPE x.
  CONSTANTS: c_breakpoint TYPE i VALUE 8.

  SELECT * FROM aab_id_act INTO TABLE wt_aab_id_act
    WHERE name       = iv_aab_id
      AND is_program = space.

  LOOP AT wt_aab_id_act INTO wa_aab_id_act
                        WHERE username = space
                           OR username = sy-uname.
    ASSIGN wa_aab_id_act-actmode TO <mode_x> CASTING.
    GET BIT c_breakpoint OF <mode_x> INTO w_bit_value.
    IF NOT w_bit_value IS INITIAL.
      rt_break_point_is_active = 'X'.
      EXIT.
    ENDIF.
  ENDLOOP.

ENDMETHOD.


* <SIGNATURE>---------------------------------------------+
* | Static Private Method ZCL_AAB=>TIME_DELAY
* +-------------------------------------------------------+
* +--------------------------------------------</SIGNATURE>
METHOD time_delay.
  DATA: w_time_curr TYPE tims,
        w_time_end  TYPE tims.
  DATA: w_timestamp TYPE timestampl.

  GET TIME STAMP FIELD w_timestamp.
  CONVERT TIME STAMP w_timestamp TIME ZONE sy-zonlo
    INTO TIME w_time_curr.
  w_time_end = w_time_curr + 60.
  WHILE w_time_curr < w_time_end.
    GET TIME STAMP FIELD w_timestamp.
    CONVERT TIME STAMP w_timestamp TIME ZONE sy-zonlo
      INTO TIME w_time_curr.
  ENDWHILE.

ENDMETHOD.
ENDCLASS.
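For completeness, here is a minimal sketch of what a caller such as Z_DEMO could look like; the checkpoint group name is illustrative and must first be created and activated in transaction SAAB:

REPORT z_demo.

START-OF-SELECTION.
* 'ZDEMO_CHECKPOINT' is a placeholder - create and activate the
* checkpoint group in transaction SAAB first.
  zcl_aab=>break_point( 'ZDEMO_CHECKPOINT' ).

* ... the code you actually want to debug follows here ...
  WRITE / 'Processing continues after the break-point'.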


Let's see it in action. First, run the program in the background and take control of it in transaction SM50.

SM50.jpg
Then, once in the debugger session, interrupt the WHILE loop by setting W_TIME_CURR at least 60 seconds into the future. For simplicity, just set it to 235959 and click the Return (F7) button.

Debugger.jpg

Press Return (F7) on next screen

Debugger2.jpg

Voilà, you are in the Z_DEMO program.

Debugger3.jpg

Let's see how the break-point works in dialog processing. Run the program; execution is interrupted at the BREAK-POINT statement. Press the Return (F7) button.

Debugger4.jpg

Voilà, you are in the Z_DEMO program.

Debugger3.jpg

XML Schema (XSD) Validation in ABAP (with Limitations)


Introduction

In this blog post I will present a simple solution, with limitations, for XML Schema (XSD) validation in ABAP using a service consumer (Enterprise Service) and reusing some basis objects.

Use Cases

This can be useful in the following cases:
  • You generate an XML in your system and you want to perform schema validation.
  • You are consuming a web service, you are using neither ABAP Proxies nor PI, and you want to validate the request and/or response.

How does it work

This technique consists of using the XSD file to generate a WSDL (a fake service description), then using the WSDL to create DDIC structures and Simple Transformations with the Enterprise Service tools. We then reuse ABAP basis classes to parse the XML into the DDIC structures. During this step a limited schema validation takes place.
This is mostly based on the blog post Power of the Core : Using XSD for XML Transformation in ABAP (from Ram Manohar Tiwari), with the difference that we are not using it to generate an XML file, but to parse one in order to check it.

How To

Generating the Service Consumer

Follow the instructions from steps 1 and 2 of the blog post Power of the Core : Using XSD for XML Transformation in ABAP.

The Code

Now use the following code in order to parse your XML into a DDIC structure, and thus perform the validation:
  DATA:
    " This is a structure generated by the framework for the
    " root element of your XML
    ls_abap_data   TYPE ztestramaccess_request_message,
    lv_xstring     TYPE xstring,
    lx_root        TYPE REF TO cx_root,
    lx_st_error    TYPE REF TO cx_st_error,
    lv_output_text TYPE string.

  " If you already have your XML in an xstring, you may skip this
  cl_secxml_helper=>string_2_utf8(
    EXPORTING
      if_input  = iv_xml " This is your XML in a string (I assume it is an importing parameter)
    RECEIVING
      ef_output = lv_xstring
    EXCEPTIONS
      OTHERS    = 0
  ).

  TRY.
      cl_proxy_xml_transform=>xml_xstring_to_abap(
        EXPORTING
          " Here you pass the name of the DDIC type
          ddic_type = 'ZTESTRAMACCESS_REQUEST_MESSAGE'
          " And here your XML in xstring format
          xml       = lv_xstring
        IMPORTING
          " Although this is technically optional (in the method signature),
          " you need to pass a structure with the right type, or you will
          " get a short dump.
          abap_data = ls_abap_data
      ).
      " Now, in case of errors during the parsing, handle the
      " exceptions.
    CATCH cx_st_error INTO lx_st_error.
      " This is the most common exception in case of error; here
      " you may want to use the attribute XML_PATH to
      " easily locate the error.
      WRITE lx_st_error->xml_path.
      NEW-LINE.
      lv_output_text = lx_st_error->get_text( ).
      WRITE lv_output_text.
      " You may also want to inspect the lx_st_error->previous
      " attribute for additional information; it is filled
      " in case of conversion errors.
    CATCH cx_transformation_error cx_proxy_fault INTO lx_root.
      " This is just to be sure...
      lv_output_text = lx_root->get_text( ).
      WRITE lv_output_text.
  ENDTRY.
The error handling in the example is very basic; you should adapt it to your own needs.

Tweaks

You may notice some false-positive conversion errors for elements of type xs:dateTime. This is because the DDIC structure is created assuming a specific variant (UTC, Local, Offset). You can change it in your proxy to a less specific value (and reactivate the proxy after the change):
proxy_dateTime.png

Limitations and other Technical Considerations

Inherited Limitations

This technique inherits all the limitations that occur when you use a service consumer. For instance, when testing, I observed that if you have two elements of a certain XML tag where only one is expected, this is ignored by the parser.
Another important limitation is that the validation stops at the first error, as the exceptions in this case are not resumable, so it is not possible to get a list of all errors in the XML.

Memory Consumption

In the case of big XML files you may observe high memory consumption, as you could have many copies of the data (the XML, the XML in xstring format, the data in the DDIC structure).
In this case, try to explore the usage of the method XML_TO_ABAP instead of XML_XSTRING_TO_ABAP. You can then use one of the classes (or their subclasses) implementing the interface IF_SXML_READER, which provides more options for reading the content of the XML.

Final Words

This method has its limitations, but it is an easy-to-implement way to provide minimal XML Schema validation in ABAP by reusing SAP standard tools.
I hope this will be helpful to the community.
Special Thanks to Ram Manohar Tiwari, as this post is mostly an adaptation of his original post mentioned above.
Best Regards,
Guilherme.

Arbitrary value store (registry) for ABAP


One of the problems ABAP developers face from time to time is the need to store arbitrary values for processing. These values do not always justify creating a new table, and there is no convenient place to store them. An example would be when some logic depends on a master data value (like a certain customer), or when the developer provides a tool offering customization options that need to be saved in an unstructured manner.

 

In cases where creating a separate configuration table is not warranted, developers often come up with creative ways to store values to avoid hard-coding them, such as writing entries to TVARV. However, this is not a very neat solution and sometimes leads to awkward workarounds for more specific sub-cases, like inventing conventions for putting composite keys into the name.

 

The solution I am presenting here (see below for a link to the source) consists of a hierarchical value store, not unlike the Microsoft Windows Registry. It allows the developer to retrieve and store values that cannot neatly be associated with some configuration and which do not warrant the creation of a new table.

 

The store is backed by the table INDX, which you will find standard in every ABAP installation and which is accessed conveniently with the statements EXPORT TO DATABASE and IMPORT FROM DATABASE. There is of course no reason not to use a custom table for this, but using INDX avoids you having to create yet another table; although if you are worried about keys colliding with other entries in there, you can simply use search-and-replace to specify your own table.
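For illustration, this is the underlying mechanism the registry builds on (not the registry API itself); the area 'ZR' and the ID are freely chosen placeholders:

TYPES: BEGIN OF ty_setting,
         key   TYPE string,
         value TYPE string,
       END OF ty_setting.

DATA: lt_settings TYPE STANDARD TABLE OF ty_setting.

* Persist arbitrary data under area 'ZR' and a freely chosen ID
EXPORT settings = lt_settings TO DATABASE indx(zr) ID 'MY_SETTINGS'.

* ...and read it back later; sy-subrc is 4 if nothing was stored yet
IMPORT settings = lt_settings FROM DATABASE indx(zr) ID 'MY_SETTINGS'.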

 

The solution is presented in the form of an API, the source for which is contained in an include. You could of course use the source-code view in the class editor in SE24 to add the class to the repository, but distributing the code as an include is just very convenient, and I am all for convenience.

 

Furthermore, the values are all stored as strings, though I imagine that if you consistently write to and read from fields of the same type, that should not be a problem. Maybe in a future version I will look at adding a type specification for each value stored.

 

Additionally, there is some concurrency control in the form of optimistic locking, though I am not entirely sure whether there is a practical need for it. Adding new entries (keys) to the registry causes the new entry and the parent entry to be saved immediately, while values in each entry must be saved explicitly (although they will be saved automatically when making changes to the entries; I am still thinking about auto-saving all changes).

 

The second part of the solution is a registry browser/editor, similar to the RegEdit application on Windows, which allows an administrator/consultant to inspect the contents of the registry.

 

 

(The editor, like the library, is provided as a self-contained piece of source code that can be pasted into a report program, avoiding you having to create additional items in the repository. See below for the link to the source code).

 

 

 

Here is an example of how you could use the registry:

 

* Make the registry API available to our program
INCLUDE zlib_registry.

DATA: reg_root     TYPE REF TO lcl_registry_entry.
DATA: reg_entry    TYPE REF TO lcl_registry_entry.
DATA: lv_customer  TYPE kunnr.
DATA: lv_run_date  TYPE d.
DATA: lv_timestamp TYPE timestamp.

START-OF-SELECTION.

* Get the root entry of the registry
  reg_root = lcl_registry_entry=>get_root( ).

* If we want to ensure, on startup, that a certain entry exists, we
* could do the following (e.g. in LOAD-OF-PROGRAM):
  reg_root->create_by_path( 'Sales/Enhancements/Process_XYZ' ).

* Retrieval of a specific entry. If we did not have the above line,
* we would have to check the result of each call to GET_SUBENTRY( )
* to ensure it is bound.
  reg_entry = reg_root->get_subentry( 'Sales' )->get_subentry( 'Enhancements' )->get_subentry( 'Process_XYZ' ).

* Getting a specific value from the entry:
  lv_customer = reg_entry->get_value( 'ProcessCustomer' ).

* Writing values to the entry:
  lv_run_date = sy-datum.
  reg_entry->set_value( key = 'LastRunDate' value = lv_run_date ).
  GET TIME STAMP FIELD lv_timestamp.
  reg_entry->set_value( key = 'LastRunDateTime' value = lv_timestamp ).

* Saving the entry
  reg_entry->save( ).


I have given some thought to security, whereby you would want to prevent your section of the registry from being inadvertently overwritten by another application or person. One approach is to lock down the registry and require applications to write their entries and provide an applicable UI, which is analogous to how Windows applications operate, e.g. when recording user settings. Another approach would be to extend the editor so that it can be run to access only a certain branch of the registry tree, and give specific users access to that.

 

You can find the entire source code of the include for the library, as well as the registry editor and the example program at the following Gist on GitHub: https://gist.github.com/mydoghasworms/08ea60e95dd1fa90c90a

 

This article has been adapted from my original blog post here: Arbitrary value store (registry) for ABAP

How much "Comments" is too much?


Throughout my ABAP career in various organizations, I have come across many organization-specific "Coding Standards Documents". And all of those documents had a couple of things in common - whenever you make a change to existing code, mark your changes with comments stating

 

*Begin of changes by Juwin Thomas on 5/27/2015 ECDK909909

......

......

......

*End of changes by Juwin Thomas on 5/27/2015 ECDK909909

 

And I kept wondering: isn't this already handled by SAP's version management? After 10 changes done by 10 different people, the commented lines in the program become more than the actual executable code. Who reads these anyway? I haven't; I always go to version management to see who did the changes, when, and what the change was.

 

Another "Coding Standard" I have seen is - Do not delete any lines from the program, comment them instead. Isn't this similar to having dead code inside the program?

 

Has anyone felt the same way? Or is there anyone here who thinks something good is going to happen with all these comments in the code?

 

One of the arguments that I have heard from people who like comments - especially code reviewers - is that in the production system, where program versions are (generally) not maintained, it becomes easy to understand the changes if there are comments. My question to them: is it too hard to open the development system and see the versions? Or, if they like to see the versions in the production system, activate versioning in that system during transport. Why degrade the code readability by putting in all these comments?

 

SAP standard programs have been around for decades and have gone through regular maintenance by different developers. But I hardly see any such comments in any of the SAP standard programs. I haven't worked at SAP Labs, so I don't know if they have such a coding standard. If they don't, how do their code reviewers manage?

 

Bored of writing this type of comment in my programs, I developed a small utility program that I carry around with me to different organizations. Whenever I have to change an existing program, I write the code the way I want, without any change comments, and run it through my utility program when I am done. The utility program reads the current version and the previous version of my changed program and inserts the comments wherever appropriate.

 

But I still keep wondering: why do we need this practice? I feel that there should also be a "Comments Standards Document", in addition to the "Coding Standards Documents", to explain when and how to write comments.

I just heard SAP are trying to kill ABAP once again


Mastering SAP Technologies 2015 Melbourne

Day 1 – Sunday 24th May

image001.png

Figure 1 – Conference Dustbins


This year the Mastering SAP Technologies conference in Melbourne was co-located with both the Mastering HR conference and the Mastering Finance conference. This meant there were 700 people there, which is an enormous amount for Australia, though of course we cannot hold a torch to the SAPPHIRE event in the USA where there are 40,000 delegates or whatever the exact number is.



1PM – SAP Inside Track


This is a free event so a lot of people who were not going to the conference rocked up including a big bunch of university students brought by SAP Mentor Tony DT.


How this works is a bit like speed dating – you have to jump from table to table with people you have not sat with before and all say who you are and what your business challenges are for five minutes, then run to the next table. During this bit you can get to talk in a small group about such challenges and everyone gets a say.


Thereafter all the challenges are written on a blackboard and the most common ones chosen for one hour or twenty minute sessions. In those sessions the group splits into enormous groups where one person totally dominates the conversation and a few others interject the odd comment while most people sit around listening.


As might be imagined, popular subjects of interest were user interfaces and S/4 HANA. The major point I heard is one I am going to go into detail about, and it all revolves around one of the students asking the question "should I learn ABAP?".


ABAP – Dead Again!


You may recall that in 2001 Shai Agassi was more or less in charge of SAP and wanted to kill off the ABAP language and replace it with Java. I believed this at the time, and as I had only been programming for a few years I was not even that attached to ABAP. However, as history tells us, nothing ever came of that initiative.


As far as I can tell – and I may have totally misinterpreted this (I hope I have) – here is the latest position from SAP.


  • They want all of us customer companies to move onto S/4 HANA in the Cloud


I hope that bit is not too contentious, not too much of a shock. Here comes the problem – as noted there and then it seems that as much as 50% of the screens that end users currently see are in fact custom screens written in ABAP. That sounds right to me, at my company it’s more like 100%.


So, in the cloud you cannot change the code. Nobody is going to put up with standard SAP – as we have seen – even standard SAP with lovely new UI5 screens. Everyone is always going to want to put their own stuff in, as they have always done with user exits and custom reports and bespoke applications for things specific to their company and so on.


How does SAP intend to square this circle? As I understand it from the talk at the Inside Track, the idea is that S/4 HANA will have APIs which custom applications can hook into, thus achieving the same sort of thing we have today.


How do you build such custom applications, such custom code? In the HANA Cloud Platform (HCP).


OK, what language do you program things in using the HCP? The answer is - any language you want, so long as it is not ABAP.


That doesn’t sound good does it? I just wrote a book about the future of ABAP, SAP has put an enormous amount of innovation into the ABAP language in recent years, and now it is dead again? That can’t be right, surely?


It sounded to me like SAP as an organisation was still going to use ABAP itself - S/4 HANA would still be in ABAP, new SAP-delivered innovations would still be written in ABAP - but we customer types cannot change the core of S/4 HANA and cannot add new applications in ABAP.


In other words SAP can still use ABAP to develop but the customers cannot because it is too good for the likes of us. Instead we have the minor task of re-writing everything we have written during the last 20 years in JavaScript or some such. No problem really, should only take us ten minutes.


When I phrase it like this do you think many organisations will jump at this idea like it is the best thing since sliced bread?


Possibly not, so I must have got this wrong. It is the only possible explanation. I must have been hearing things; SAP cannot possibly be suggesting such an approach. The funny thing was at that point I had not even started drinking. Even stranger when I asked other people throughout the conference they seemed to have heard the same message as well.


6:30PM – Jumpstart / Demo Jam


The Demo Jam is a contest where people have a limited time to demonstrate something new and exciting they have created using SAP technology, usually mixed with other technology.


This year it was all about bicycle power which ties in nicely with SAP’s current focus on sports and applications to monitor sports performance.


7:30PM Network Drinks

Figure 2.jpg

Figure 2 – Conference Food


Then it was networking time. The idea here is to talk to as many people as you can and make connections. The drinking is an aside and by no means the main focus.

Figure 3.png

Figure 3 – Networking


Day 2 – Monday 25th May

Figure 4.png

Figure 4 – Conference Dustbin


SAP User Interface Strategy


This was a presentation from SAP and no huge surprises here. The idea is that UI5 is the go-to technology; everything else is a sort of bridging technology. Screen Personas is a sort of stop-gap measure, and NWBC is a way to see all the disparate UI technologies at once e.g. SAP GUI, Web Dynpro and UI5.


No-one at SAP is ever going to say that Web Dynpro or the SAP Portal is dead, especially the latter, as people pay maintenance fees for that. However, reading between the lines, I get the feeling that both are in the "Dodo / Dinosaur" basket.

Figure 5.jpg

Figure 5 – Lunch on Day 2


Netweaver Business Client – John Moy


Now it was time for two fantastic presenters, one after the other, again both talking about UI technology which was a big focus for this conference.


John Moy was talking about his experience implementing NWBC in real life. Some people at my organisation are mad keen to try this, and John made it look very impressive indeed. The killer is that really you have to be on a higher level than EHP5 to get the most out of this, and EHP5 is what my company is on. We upgraded in December 2011, and that does not seem like too long ago to me, but we are already miles behind and really cannot go through the effort of a major upgrade (which is what putting in an enhancement pack is) again so soon.


By getting the most out of this technology I mean things like having side panels (so-called CHIPs) appearing beside standard SAP GUI transactions without having to modify such transactions.


In addition, Julie Plummer posted a blog on SCN the other day about the future of UI in SAP and (as always) the next version of NWBC coming out in 2016 looks like the important one, as it will have the Fiori Launchpad as the “start” menu.


UI5 – Graham Robinson


There is very little I can say here I have not said before – Graham’s presentations are very “hands on” with lots of live feeds from his system as opposed to static PowerPoint slides.


He noted all the "good parts" of UI5, like its use of the MVC pattern and how it responds to running on different devices automatically.


The “worst part” was the he shock of SAP development organisations being dragged kicking and screaming into the modern world full of horrible things like agile development with rapid releases and having to deal with quirky new things like deploying software onto mobile devices.


Networking Time Once More

Figure 6.png

Figure 6 Networking

Day 3 – Tuesday 26th May

Figure 7.png

Figure 7 – Conference Robotic Dustbin


UI5 at Australia Post


Australia Post keep losing things posted to me and it seems to take forever to post anything from one side of Sydney to the other, but that’s just my experience and I don’t think I can actually blame the IT department for that.


Anyway, the IT department is clearly ahead of the curve when it comes to adopting new technology, and this presentation was all about how they had adopted UI5 to replace some applications and as a basis for creating new ones.


They covered the technical basics, which are all over SCN so I won't reiterate them here, but the important thing is how easy they found it to make the change once they had conquered the fear of doing something totally new. They certainly do not regret it.

They also stressed how fast new developments were on this platform.

Figure 8.jpg

Figure 8 – Conference Lunch Day 2

Latest Development Techniques #1

Alisdair Templeton gave a presentation which was a “grab bag” of things developers should be thinking about when writing programs.


This was really good; he had nowhere near enough time to cover everything he wanted but managed to pack a load of stuff in. I like it when people challenge basic assumptions, and one was "the fallacy of re-use". All the IT articles keep on and on about re-use being the be-all and end-all of development, but AT questioned whether this did not cause more problems than it solved by introducing dependencies all over the place.


Latest Development Techniques #2


Ben Patterson then gave a talk on the Business Objects Processing Framework (BOPF). I have experimented a lot with this recently, as it was part of my book, and I am glad this is getting some attention at conferences, as I believe this framework has a lot of potential.


Latest Development Techniques #3


John Patterson (no relation to Ben) then gave a talk about using "Grunt" to automate various development tasks whilst developing UI5 applications. This was accompanied by a big screencam of the tool in action to make it a bit more real.


Why this is useful is a very difficult concept to grasp for an ABAP developer, as a lot of things needed in other programming languages are done "behind the scenes" in the ABAP environment. This is all lovely until you want to put applications on smartphones and the like, and then people really get outside of their comfort zone.


John Patterson may not be Stevie Winwood, but we used to work together back in the year 2000 and this was the first time I have talked to him since.


Screen Personas


The saddest thing about the conference was the fact that Steve Rumsby from the University of Warwick could not make it over to talk about his experiences with Screen Personas. I was looking forward to talking and listening to him; my understanding is that he has made Screen Personas sing and dance and do a lot of things it is not supposed to be capable of.


Still, a chap from SAP gave this talk instead, though hearing SAP say something about one of their products is never going to be as good as hearing a "customer" doing a "warts and all" story.


The main take-away is that you really need to wait until Screen Personas 3.0 is in general availability. I had heard it was often in the "too hard" basket to get Screen Personas 2.0 to combine multiple tabs and screens into one (a common requirement, and one that the premium version of GuiXT could do 16 years ago). In addition, version 2.0 runs on Silverlight, which is a dying technology.


Elephant Whispering


The best presentation of the conference was the last one; Greg Taylor gave a talk about change management and a brilliant one at that.


Every so often he would distract the audience by showing them a funny video and by the time the video was over (only about two minutes) he was back on stage in a different costume, talking in a totally different accent. Or it could have been a succession of people who looked just like him.


Networking


Then after a bit more networking it was time to go home.

figure 9.png

Figure 9 – Networking


However sad the end of one conference makes one, they are just like buses - if you wait a while another one comes along. In the case of Australia the next one is the Australian SAP User Group (SAUG) conference in August. I am talking at that one about the latest developments in ABAP.


I wanted to talk about that very subject at this conference, but as I mentioned earlier in this blog it appears ABAP is on its way out once again as a programming language for us customer types.


Naturally if someone from SAP wants to tell me I have got totally the wrong end of the stick then I am all ears – in fact it will help me sleep better at night.


Cheersy Cheers


Paul

 

Number Ranges - Things to Keep in Mind


Enterprises with a global presence do a lot of business across the world, in various countries, bound by the rules of each particular country.

Take the example of a typical manufacturing company.

 

Its daily operations include procurement of materials, sales of goods and services, product manufacturing, financial transactions and a lot more. The SAP ERP system is the right solution for all these kinds of business transactions. Every business transaction is recorded in the SAP system with a unique identifier (a number or alphanumeric characters). The good thing is that we can choose where the number range should start, where it should end, and whether it should be internal or external. This is not restricted to a particular entity like a sales order or purchase order; it can be configured for any kind of business transaction and can even be used to generate a unique sequence number for custom data (like Z transactions storing data in Z tables).

 

However, these number ranges should be monitored carefully; negligence can lead to dire consequences, especially in the production system.

I have prepared a list of points to keep an eye on, particularly in production systems.

Important Repository objects to Keep in Mind while working with Number Ranges

 

SNRO - Number range maintenance transaction code

NRIV - Number range intervals database table

SNUM - Number range object maintenance

 

Function Modules

NUMBER_GET_BUFFER - Reads buffer information

NUMBER_GET_INFO - Number range: provides information for a number range interval

NUMBER_GET_NEXT - Number range: assigns the next free number

NUMBER_GET_NEXT_V1 - Assigns next free number(s)
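As a reference, here is a minimal sketch of drawing a number with NUMBER_GET_NEXT; the object name 'ZMY_OBJECT' and interval '01' are placeholders for your own number range object from SNRO:

DATA: lv_number TYPE nriv-nrlevel,   " returned number (character-like)
      lv_rc     TYPE char1.

CALL FUNCTION 'NUMBER_GET_NEXT'
  EXPORTING
    nr_range_nr             = '01'
    object                  = 'ZMY_OBJECT'
  IMPORTING
    number                  = lv_number
    returncode              = lv_rc
  EXCEPTIONS
    interval_not_found      = 1
    number_range_not_intern = 2
    object_not_found        = 3
    quantity_is_0           = 4
    quantity_is_not_1       = 5
    interval_overflow       = 6
    buffer_overflow         = 7
    OTHERS                  = 8.
IF sy-subrc <> 0.
* React here, e.g. alert the support team before the interval overflows
ENDIF.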

 

 

Buffering Problem

Here you can observe that main memory buffering is enabled for the material number range object, with the number of numbers in the buffer set to 10. This means that when the number range object is called for the first time, the starting number is picked up (say the material number range starts at 1000 and ends at 2000).

Material number 1000 will be created, and the subsequent numbers 1001, 1002, 1003, ... up to 1010 will be stored in the memory buffer. The next time the number range object is called, the sequence present in the buffer is used instead of querying the backend database table.

This can give better performance, but it leads to problems if someone erases the memory buffer. Clearing the memory buffer can be done using SM56, and one should understand the consequences of this activity. It often leads to skipped numbers, and we can end up with documents that are no longer in sequence.
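As a side note, the last number actually persisted in the database (as opposed to numbers already handed out from the buffer) can be checked in field NRLEVEL of table NRIV; the object name below is only an example:

DATA: lv_nrlevel TYPE nriv-nrlevel.

* Last number written to the database for interval '01' of the object.
* With buffering active, numbers already taken from the buffer can be
* higher than this value. 'MATERIALNR' is used here as an example
* object name; replace it with the object you are monitoring (see SNRO).
SELECT SINGLE nrlevel
  INTO lv_nrlevel
  FROM nriv
  WHERE object    = 'MATERIALNR'
    AND nrrangenr = '01'.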

 

img1.png

 

   

Monitor Number Range utilization

If the number range reaches its upper limit and is not changed before the next document gets created, the system raises an exception, which leads to an SM13 update failure and subsequently an ST22 dump like the one below. If we are performing the transaction in the foreground, the system is able to issue a warning when the number range utilization reaches the warning percentage.

Now let's say the documents are created through a batch job: unless we monitor the system or incorporate an automatic alert mechanism to notify us about the failure, we will not be aware of this issue. If it occurs in the production system and we don't take the necessary action, it can have bad consequences and affect the enterprise's business as well.

 

img2.jpg

img3.png

Interval Rolling

This feature from SAP can be seen below as a checkbox, and it comes into the picture when the number range is completely exhausted. So, in continuation of the previous subtopic on number range utilization: upon reaching the upper limit, we can decide whether document numbering starts from the beginning again by not enabling the checkbox. While this seems to be a pretty good option, care should be taken that documents already created are archived/deleted periodically.

Otherwise we would encounter the same ST22 duplicate-records error discussed in the previous topic.

 

img4.png

Keep an eye on number ranges for which changes are to be done in the golden client "000"

Generally we have a restricted number limit for spools, and if it gets used up we will get SPOOL_INTERNAL_ERROR and the system will no longer be able to generate spools.

The spool number range should be changed only in the golden client "000"; changes done in any other client will not be effective.

We need to change the SPO_NUM number range and set the upper limit to 99,999. It is also good practice to delete spools regularly, so that the system uses the number range from the beginning again once it reaches the upper limit.

The spool retention period is 8 days by default, and spools can be deleted using report RSPO1041, which can be scheduled as a background job using SM37. The link below from help.sap.com explains further details.

Please read through

http://help.sap.com/saphelp_nw74/helpdata/en/4e/a0576140a97118e10000000a42189e/content.htm

The standard SAP transaction that can be used to check spool utilization is RZ20.


img5.png

 

You can follow the path RZ20 → SAP CCMS Monitor Templates → Spool System.

You can observe the percentage utilization and the colour legend showing green, which implies that "all is well".

 

img6.png

 

 

 

Never Transport Number Ranges to Production System

Generally an SAP system has a typical landscape of four systems:

  1. Sandbox → Development → Quality → Production

While the sandbox and development systems are purely used for the implementation of repository objects and unit testing from a developer perspective, the quality system is used for business user acceptance testing.

The data in the quality system gets refreshed from production often, but not on a daily basis. In production, business transactions happen every day and in huge numbers, and therefore the data is out of sync with the quality system. Even though we initially maintain the same number ranges across the two systems, the current number used will not always be the same.

Hence it is not good practice to transport number ranges to production. If this is done, the system will trigger SM13 update errors and ST22 dumps.

 

 

Unauthorized Access to SNRO transaction

Granting access to SNRO to unauthorized SAP users might lead to inconsistencies in number ranges if they are not sure how to handle them. If someone changes an interval by mistake, we can still retrieve the original series using change documents.

  1. These can be accessed from the next screen of SNRO (i.e. once you press the Display button on the initial screen). Then follow the menu path Goto → Number Ranges.

Please note: there is a "Change Documents" option available on the menu and on the toolbar itself, but it only shows the change data of the number range object attributes (like a text change or checkbox change). In order to see the interval change documents, you need to follow the path below.

From the change documents screen you will get to know the user, timestamp, old value, new value, etc. This is the beauty of change documents in SAP.

 

img7.png

img8.png

img9.png


Local exception classes in Global classes


Note: the original of this article comes from my blog at oprsteny.com

 

In this article I'd like to present a solution for implementing a global class where all internal exceptions (exceptions raised by private methods only ⇒ always caught within the class itself) are handled with a local exception class.

 

It is generally possible (and recommended) to use global exception classes, but sometimes you may find a local exception class more practical (no DDIC entries, no heavy coding/maintenance, ...).

 

I also wanted a simple solution supporting the WHERE-USED functionality for all messages defined in SE91 which can be generated in my program's exceptions.

Exception messages in SE91

 

Sadly, it is not possible to create a local exception class in a global class's "local definitions" section AND use it in the global class method's exceptions section.

This is because such a local exception class is NOT visible from the global point of view, as will be demonstrated in the following few steps.

 

Navigation to the global class's local definitions & implementations:

Navigation to class's local definitions & implementations

 

Here follows our local exception class definition which I entered in the "Local Class Definitions/Types" section

 

CLASS lcx_exception DEFINITION INHERITING FROM cx_static_check.
  PUBLIC SECTION.
    METHODS:
      constructor,
      get_text REDEFINITION.
    CLASS-DATA:
*     Statically accessible variable - the reason will be shown later
      message TYPE bapiret2-message.
  PRIVATE SECTION.
    DATA:
      mv_message TYPE bapiret2-message.
ENDCLASS.

 

I keep the local exception class implementation very simple:

 

CLASS lcx_exception IMPLEMENTATION.
  METHOD constructor.
    super->constructor( ).
*   Save the static variable to the instance variable
    me->mv_message = lcx_exception=>message.
  ENDMETHOD.

  METHOD get_text.
*   Return the message from the instance variable
    result = me->mv_message.
  ENDMETHOD.
ENDCLASS.

 

Now try to create a new class method (called e.g. TEST_METHOD) which should raise exceptions of type LCX_EXCEPTION - you can see the system reacts with a popup message saying that such an exception class is NOT KNOWN!!!

 

...so the system offers to create a new GLOBAL exception class - but this is not what we wanted.

Local exceptions are not known

 

Therefore the class method must raise a generic exception (e.g. CX_STATIC_CHECK) or a global exception from which your local exception class inherits.

Raising CX_STATIC_CHECK exception

 

In the caller method (called e.g. RUN_TEST) you handle the exception occurrence as if it were of your local exception class type.


METHOD run_test.
  DATA:
    lr_my_exception     TYPE REF TO lcx_exception,
    lr_static_exception TYPE REF TO cx_static_check,
    lv_message          TYPE bapiret2-message.

  TRY.
      me->test_method( ).

*   Catch my locally defined exception
    CATCH lcx_exception INTO lr_my_exception.
      lv_message = lr_my_exception->get_text( ).
      WRITE lv_message.

*   Catch any other exception of type CX_STATIC_CHECK
    CATCH cx_static_check INTO lr_static_exception.
      lv_message = lr_static_exception->get_text( ).
      WRITE lv_message.
  ENDTRY.
ENDMETHOD.

 

The code of the method TEST_METHOD being called would be like this:

 

METHOD test_method.
  MESSAGE e001(ztest)
    WITH 'ZCL_EXCEPTION_DEMO-TEST_METHOD'
    INTO lcx_exception=>message.
  RAISE EXCEPTION TYPE lcx_exception.
ENDMETHOD.


Important note:

In the above code please note the MESSAGE command.

Thanks to this command, the SAP system is able to find the usage of the message with the WHERE-USED functionality in SE91.

Message WHERE-USED

 

For testing purposes I created a simple program that proves the solution:

 

REPORT zr_test_local_exceptions.

DATA: lr_demo TYPE REF TO zcl_exception_demo.

START-OF-SELECTION.
  CREATE OBJECT lr_demo.
  lr_demo->run_test( ).

 

...and here comes the output:

Exception output

Defining parameter values using the Search Help Exit


     This is my first post; it is quite simple, but it may help someone.

     Any correction or tip is welcome.


     Here is a small and simple tutorial on how to put fixed values into a search help through a Search Help Exit, as in the image below:

image 5.jpg

 

  • Create a function group.

          E.g.: ZGF_SHLP


  • Create a function module for the Search Help Exit.

          E.g.: ZSHLP_EXIT


  • In the function module's changing parameters, define the following:

image 1.jpg


  • In the table parameters:

image 2.jpg


  • In the source code of the function module:

          In this example, the COD_TAB field will show the values 5.1.1 and 5.3 by default.

image 3.jpg

         If you uncomment the line (callcontrol-step = 'SELECT'.), the selection window does not appear; the search help opens with the selection already executed, using the filters defined in the code.
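         For readers who cannot see the screenshots, here is a minimal sketch of what the exit's source code could look like (the interface follows the usual search help exit pattern; the field name COD_TAB comes from the post, while the concrete values and the PRESEL step handling are assumptions):

FUNCTION zshlp_exit.
*  The standard search help exit interface is assumed here:
*  TABLES     shlp_tab    TYPE shlp_desct
*             record_tab  STRUCTURE seahlpres
*  CHANGING   shlp        TYPE shlp_descr
*             callcontrol LIKE ddshf4ctrl

  DATA: ls_selopt TYPE ddshselopt.

* The PRESEL step (preparation of the selection dialog) is assumed here
  IF callcontrol-step = 'PRESEL'.
    ls_selopt-shlpfield = 'COD_TAB'.
    ls_selopt-sign      = 'I'.
    ls_selopt-option    = 'EQ'.
    ls_selopt-low       = '5.1.1'.
    APPEND ls_selopt TO shlp-selopt.
    ls_selopt-low       = '5.3'.
    APPEND ls_selopt TO shlp-selopt.
*   Uncommenting the next line skips the selection dialog and runs the
*   selection immediately with the filters defined above
*   callcontrol-step = 'SELECT'.
  ENDIF.

ENDFUNCTION.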


  • In SE11, inside the search help, enter the created function module in the 'Search help exit' field.

image 4.jpg

 

     And that's it. I hope it helps.

Kill delayed work processes automatically


Kill delayed work processes automatically through a batch job

 

Generally we use transaction SM50 to view running processes and delete the long running ones manually. This is an attempt to turn that manual activity into an automated process.


SAP provides the function module 'TH_SERVER_LIST', which returns the details of all SAP application servers on the network.


SAP provides the function module 'TH_WPINFO', which returns the details of the work processes on a given SAP application server.


We can then filter this list to decide which work processes should be stopped, based on the process type and the time it has been running.


We take the process ID (PID) and the SAP application server name and use the kernel call 'ThWpInfo' to stop the process permanently.


This report can be scheduled so that it runs on its own and stops the long running processes automatically.


Note that this program uses unreleased function modules and kernel calls, so it is to be used at your own risk.

 

Complete source is as follows:


I do hope it helps.


I thank Mr. Matthew Billingham for his valuable suggestions and guidance.


REPORT zr_kill_process.

DATA: itab          LIKE STANDARD TABLE OF wpinfo,
      wa            LIKE wpinfo,
      delay_seconds TYPE i VALUE 900.

DATA: BEGIN OF ty.
        INCLUDE STRUCTURE msxxlist_v6.
DATA: END OF ty.

DATA: itab_as LIKE STANDARD TABLE OF ty,
      wa_as   LIKE ty.

CONSTANTS: opcode_wp_stop TYPE x VALUE 2.

* Get the list of all application servers
CALL FUNCTION 'TH_SERVER_LIST'
  TABLES
    list           = itab_as
  EXCEPTIONS
    no_server_list = 1
    OTHERS         = 2.

LOOP AT itab_as INTO wa_as.

* Get the work process list of the current server
  CALL FUNCTION 'TH_WPINFO'
    EXPORTING
      srvname = wa_as-name
    TABLES
      wplist  = itab
    EXCEPTIONS
      OTHERS  = 1.

  LOOP AT itab INTO wa.

    IF wa-wp_typ = 'DIA' AND wa-wp_status = 'Running' AND wa-wp_eltime GT delay_seconds.

*     Kernel call that stops the work process
      CALL 'ThWpInfo'
        ID 'OPCODE' FIELD opcode_wp_stop
        ID 'SERVER' FIELD wa_as-name
        ID 'PID'    FIELD wa-wp_pid.

    ENDIF.

  ENDLOOP.

ENDLOOP.

Mandatory Views in Material Master - MM01 and MM02



'How to make material master views mandatory?' This is one of the often asked questions on SCN. I also came across a similar requirement, and after going through several posts I got an idea of how to achieve it. However, the steps were not properly documented and were spread across multiple posts, which prompted me to write this blog post.

 

 

Requirement:

 

  • Make the Basic Data 1, Basic Data 2 and Classification Data views mandatory. An error message should be triggered if the user tries to save a material without maintaining these views.
  • Make all material classes and material characteristics mandatory.
  • This should work for both MM01 and MM02. Thus a user should get an error if a new material is created without the mandatory views through MM01, or if changes are made to an existing material that does not have the mandatory views.

 

 

Approach:

 

  • We will make use of  customer-exit EXIT_SAPLMGMU_001 to write our logic.

  • Function module MAIN_PARAMETER_GET will be used to get the views currently being maintained or views already present for a material ( material extension or change scenario).

  • CLAP_DDB_ALLOCATION_FR_BUFFER will be used to read the material classes in buffer.

  • CLAP_DDB_GET_BUFFER_PARAMS to read the material characteristic values in buffer.

  • CLAF_CLASSIFICATION_OF_OBJECTS to get the material characteristics from database.
  • Further explanations are given below as a part of comments in code snippets for better understanding.



Check for mandatory views:

 

Function module MAIN_PARAMETER_GET will give the list of material master tables that are going to be updated. The tables parameter MTAB is filled with the list of used tables at run time.

 

This table has 2 fields which we need to read to get the views being maintained.

 

MTAB-BISTSTAT will have the list of views that already exist for a material.

 

MTAB-PFSTATUS will have the list of views currently being maintained.

 

 

Code:

 

    CALL FUNCTION 'MAIN_PARAMETER_GET'
      TABLES
        mtab = lt_mtab.

    IF sy-subrc EQ 0.

*--Check the existing views and the views currently being created.
*--Read the entry for table MARA. The status for Basic Data is 'K',
*--for Classification data it is 'C'.
      CLEAR ls_mtab.
      READ TABLE lt_mtab INTO ls_mtab WITH KEY tbnam = 'MARA'.
      IF sy-subrc EQ 0.

*--Check for basic view
        IF ls_mtab-biststat CA 'K' OR ls_mtab-pfstatus CA 'K'.
*--Basic Data view is present ( do nothing )
        ELSE.
*--Give error message
          MESSAGE 'Material Master Views Basic Data 1 / Basic Data 2 are mandatory' TYPE 'E'.
        ENDIF.

*--Check for classification view
        IF ls_mtab-biststat CA 'C' OR ls_mtab-pfstatus CA 'C'.
*--Classification Data view is present ( do nothing )
        ELSE.
          MESSAGE 'Material Master View Classification is mandatory' TYPE 'E'.
        ENDIF.

      ENDIF.

    ENDIF.

 

 

Check for mandatory classes and characteristic values in Classification View:


If the user is currently maintaining the classification view (MM01 material create), the check should be done against the buffer values. In case of a material extension or change (MM02), the values should be read from the database to check for completeness.


Code:


Check buffer values ( material create - MM01 ).

*--If classification data tab is currently being maintained
IF ls_mtab-pfstatus CA lc_c.

*--Check for material classes
  CLEAR lv_objkey.

*--Pass material number in object key
  lv_objkey = cmara-matnr.

*--Read buffer values
  CALL FUNCTION 'CLAP_DDB_ALLOCATION_FR_BUFFER'
    EXPORTING
      object                   = lv_objkey
      classtype                = '001'
      ptable                   = 'MARA'
    TABLES
      t_allocations            = lt_allocations
    EXCEPTIONS
      no_allocations_in_buffer = 1
      class_not_in_buffer      = 2
      allocation_not_in_buffer = 3
      missing_parameter        = 4
      OTHERS                   = 5.

*--Delete the entries in lt_allocations where VBKZ is 'D' ( deleted allocation ).
*--After the delete we only have unchanged entries ( VBKZ ' ' ) or newly
*--added ones ( VBKZ 'U' ). This ensures that allocations deleted in the
*--current maintenance are taken into account.
  DELETE lt_allocations WHERE vbkz EQ 'D'.

*--Now we can check in table lt_allocations whether all classes are present.
*--Loop at the table and check if all the required material classes are there.
*--If required classes are not maintained, give an error message, else proceed.
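As a hedged sketch of that completeness check (the required class names and the field name CLASS in LT_ALLOCATIONS are assumptions to be adapted to your own class model):

* Illustrative only: make sure every required class is still allocated
  DATA: lt_required TYPE TABLE OF klasse_d,
        lv_class    TYPE klasse_d.

  APPEND 'ZMAT_CLASS_A' TO lt_required.   "assumed class names
  APPEND 'ZMAT_CLASS_B' TO lt_required.

  LOOP AT lt_required INTO lv_class.
    READ TABLE lt_allocations TRANSPORTING NO FIELDS
         WITH KEY class = lv_class.
    IF sy-subrc NE 0.
      MESSAGE 'All mandatory material classes must be maintained' TYPE 'E'.
    ENDIF.
  ENDLOOP.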

 

*--Check for mandatory classifications
  CALL FUNCTION 'CLAP_DDB_GET_BUFFER_PARAMS'
    EXPORTING
      object     = lv_objkey
      classtype  = '001'
      obtab      = 'MARA'
    TABLES
      e_ausp_tab = lt_ausp.

*--Delete the entries in lt_ausp where STATU is 'L' ( old values ).
*--After the delete we only have unchanged values ( STATU ' ' ) or new
*--changed values ( STATU 'H' ). This ensures there is only one entry
*--per classification.
  DELETE lt_ausp WHERE statu EQ 'L'.

*--Now we can check in table lt_ausp whether values for all material
*--classifications are present. If not, trigger an error message.
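A hedged sketch of that check (the characteristic name Z_MY_CHAR and the lookup via CABN are purely illustrative):

* Illustrative only: ensure a buffered value exists for a required characteristic
  DATA: lv_atinn TYPE atinn.

* Internal number of the required characteristic (name assumed)
  SELECT SINGLE atinn FROM cabn INTO lv_atinn
         WHERE atnam = 'Z_MY_CHAR'.

  READ TABLE lt_ausp TRANSPORTING NO FIELDS WITH KEY atinn = lv_atinn.
  IF sy-subrc NE 0.
    MESSAGE 'Values for all mandatory characteristics must be maintained' TYPE 'E'.
  ENDIF.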


 

Code:

 

Check database values ( MM01 - material extension scenario where the classification tab is not maintained currently, OR MM02 - material change ).

 

 

*--If classification tab is not being maintained currently
ELSE.

*--Pass material as object key
  lv_objkey = cmara-matnr.

  CALL FUNCTION 'CLAF_CLASSIFICATION_OF_OBJECTS'
    EXPORTING
      classtype          = '001'
      object             = lv_objkey
    TABLES
      t_class            = lt_class
      t_objectdata       = lt_objectdata
    EXCEPTIONS
      no_classification  = 1
      no_classtypes      = 2
      invalid_class_type = 3
      OTHERS             = 4.

*--Check for mandatory classes
*--Check in table LT_CLASS if all the material classes are present.
*--If not, trigger an error message.

*--Check for characteristic values
*--Fetch the characteristic details from CABN.
*--Check if a value exists for each characteristic in table LT_OBJECTDATA.
*--To determine whether a value has been given for a characteristic, compare with the character '?'.
*--If a value is not present, CLAF_CLASSIFICATION_OF_OBJECTS returns '?'; otherwise it can be blank or some value.

ENDIF.
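A hedged sketch of the '?' comparison described in the comments above (the value field name AUSP1 of LT_OBJECTDATA is an assumption):

* Illustrative only: '?' means no value was returned for the characteristic
  DATA: ls_objectdata LIKE LINE OF lt_objectdata.

  LOOP AT lt_objectdata INTO ls_objectdata.
    IF ls_objectdata-ausp1 EQ '?'.
      MESSAGE 'Values for all mandatory characteristics must be maintained' TYPE 'E'.
    ENDIF.
  ENDLOOP.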



Execution Snapshots:

 

Go to MM01 to create material:

 

1.png

 

Enter the material number, industry sector and material type. Select the views to be maintained. Do not select the Basic Data views.

 

2.png

 

Enter data on views selected:

 

3.png

 

4.png

 

Save the material.

 

5.png

Error message is triggered.

 

 

NOTE:

 

  • The class type for material classes is '001'.
  • Material class and characteristic related data can be fetched from tables KLAH and KSML (see the sketch below).
  • Characteristic details can also be fetched from table CABN.
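A hedged sketch of how the characteristics belonging to one class could be read from these tables (the class name ZMAT_CLASS_A is an assumption):

DATA: lv_clint TYPE clint,
      lt_atinn TYPE TABLE OF atinn,
      lt_atnam TYPE TABLE OF atnam.

* Internal number of the class (class type 001 = material class)
SELECT SINGLE clint FROM klah INTO lv_clint
       WHERE klart = '001'
         AND class = 'ZMAT_CLASS_A'.      "assumed class name

* Characteristics assigned to the class
SELECT imerk FROM ksml INTO TABLE lt_atinn
       WHERE clint = lv_clint.

* External characteristic names from CABN
IF lt_atinn IS NOT INITIAL.
  SELECT atnam FROM cabn INTO TABLE lt_atnam
         FOR ALL ENTRIES IN lt_atinn
         WHERE atinn = lt_atinn-table_line.
ENDIF.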

ABAP Trapdoors: CALL TRANSFORMATION id(ontwork)


Welcome to another ABAP Trapdoors article - the first one posted using my new SCN user. If you are interested in the older articles, you can find a link list at the bottom of this post.

 

I've been using asXML for some years now as a convenient way to serialize and deserialize arbitrary ABAP data structures. Some time ago, I learned about IF_SERIALIZABLE_OBJECT and its use to include class instances (aka objects) in an asXML representation as well. A few days ago, I decided to use this technique in a current development project. At the same time, I was trying to use CL_DEMO_OUTPUT_STREAM instead of classic lists as suggested by the online documentation, and since I was supposedly familiar with the basics of using transformations, I focused rather on the usage of this new output technology. I hacked together a small demo program like this one:

 

REPORT z_test_serialize.

CLASS lcl_serializable_thingy DEFINITION CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_serializable_object.
    METHODS constructor
      IMPORTING
        i_foo TYPE string.
    METHODS get_foo
      RETURNING
        VALUE(r_foo) TYPE string.
  PRIVATE SECTION.
    DATA g_foo TYPE string.
ENDCLASS.

CLASS lcl_serializable_thingy IMPLEMENTATION.
  METHOD constructor.
    g_foo = i_foo.
  ENDMETHOD.
  METHOD get_foo.
    r_foo = g_foo.
  ENDMETHOD.
ENDCLASS.

CLASS lcl_main DEFINITION CREATE PRIVATE.
  PUBLIC SECTION.
    CLASS-METHODS run.
ENDCLASS.

CLASS lcl_main IMPLEMENTATION.
  METHOD run.
    DATA: lr_stream        TYPE REF TO cl_demo_output_stream,
          l_foo_in         TYPE string,
          lr_first_thingy  TYPE REF TO lcl_serializable_thingy,
          l_xml_data       TYPE string,
          lr_second_thingy TYPE REF TO lcl_serializable_thingy,
          l_foo_out        TYPE string.

    lr_stream = cl_demo_output_stream=>open( ).
    SET HANDLER cl_demo_output_html=>handle_output FOR lr_stream.
    lr_stream->write_text( iv_text   = 'XML Serialization of ABAP Objects Instances'
                           iv_format = if_demo_output_formats=>heading
                           iv_level  = 1 ).

    l_foo_in = |Hello, this is Foo Bar calling from { sy-sysid } client { sy-mandt }.|.
    lr_stream->write_data( iv_name   = 'Input Data'
                           ia_value  = l_foo_in
                           iv_format = if_demo_output_formats=>nonprop ).

    CREATE OBJECT lr_first_thingy
      EXPORTING
        i_foo = l_foo_in.

    CALL TRANSFORMATION id
      SOURCE instance = lr_first_thingy
      RESULT xml      = l_xml_data.

    lr_stream->write_data( iv_name   = 'XML Serialization'
                           ia_value  = l_xml_data
                           iv_format = if_demo_output_formats=>nonprop ).

    CALL TRANSFORMATION id
      SOURCE xml      = l_xml_data
      RESULT instance = lr_second_thingy.

    l_foo_out = lr_second_thingy->get_foo( ).
    lr_stream->write_data( iv_name   = 'Output Data'
                           ia_value  = l_foo_out
                           iv_format = if_demo_output_formats=>nonprop ).

    lr_stream->close( ).
  ENDMETHOD.
ENDCLASS.

START-OF-SELECTION.
  lcl_main=>run( ).

Instead of the expected output (some text, an XML representation of the instance and the same text again), I got - a shortdump. The reference lr_second_thingy was not set after the second transformation - so the deserialization must somehow be broken, right? The debugger quickly revealed that the string variable that was supposed to contain the serialized instance was empty - so it's the serialization that must be broken, then, and not the deserialization? Well, they both are, in a way. To cut straight to the point, here is the faulty code:

 

    CALL TRANSFORMATION id
      SOURCE instance = lr_first_thingy
      RESULT xml      = l_xml_data.

    lr_stream->write_data( iv_name   = 'XML Serialization'
                           ia_value  = l_xml_data
                           iv_format = if_demo_output_formats=>nonprop ).

    CALL TRANSFORMATION id
      SOURCE xml      = l_xml_data
      RESULT instance = lr_second_thingy.

And here is the corrected version:

 

    CALL TRANSFORMATION id
      SOURCE instance = lr_first_thingy
      RESULT XML        l_xml_data.

    lr_stream->write_data( iv_name   = 'XML Serialization'
                           ia_value  = l_xml_data
                           iv_format = if_demo_output_formats=>nonprop ).

    CALL TRANSFORMATION id
      SOURCE XML        l_xml_data
      RESULT instance = lr_second_thingy.

Yup, the difference is a single character - or two characters in this case. Without the equals sign, XML is treated as a keyword to denote a variable containing the raw XML data. With the equals sign, something else happens that I have yet to find a sensible and practical use for - at least when used with the identity transformation. You can spot this issue if you use the pretty printer to convert the keywords to upper case - and if you notice the tiny difference between xml and XML.

 

Older ABAP Trapdoors articles


XSLT: how to get source file for debugging on "call transformation"


Hi ABAP XML experts,
some days ago I started a discussion on SCN and asked about debugging in ABAP/XSLT. I am new on handling XLST by asXML, not by the ST, simple transformation. In addition to this I am not an XSLT expert. I did a lot of simple transformations but now, I had to use a real XSLT program. The SCN question of my problem wasn't unfortunately answered. Have a short look ...

 

http://scn.sap.com/thread/3751547

 

In parallel, I tried to find the solution myself, of course. After some very short nights of sleep, a lot of coffee, and many open books, blogs and FAQs, I found the easy solution. Experts will think "where is the problem?", but I lost a lot of time on this.


To earn my badge on blogging, I decided to share my new knowledge about this issue. I hope I can help someone in the same situation someday with this blog. If so, it would be nice if you left a little reply. This is my first blog, so I hope I am doing this the right way and you will enjoy it!

 

What is the matter? Debugging XSLT!

 

If you try to export complex ABAP structures via XSLT to XML, writing the XSLT rules is not always easy. If you have internal tables referenced within internal tables, it takes all your attention to handle this correctly. In my case, the ABAP structures additionally contained a lot of content realized by dynamic structures, depending on the object you want to export.

 

After writing my XSLT, drinking the last of my cold coffee, and putting all the XSLT books away from my desk, it was time to do the first run of the transformation ... and of course it failed.

 

This was the time to check the XSLT code with the XSLT debugger within transaction XSLT_TOOL. I realized that I needed the "exact" source XML for debugging. There is no way (I think) on an ABAP 7.00 system to do this directly by debugging the ABAP code and jumping into the XSLT debugger.


Let's see how to solve this ...

 

Step 1: define the ABAP side program to use a transformation.
In my case I created a little table with references to my internal ABAP tables or structures.

 

DATA: lt_source TYPE abap_trans_srcbind_tab,
      ls_source TYPE abap_trans_resbind,
      lv_xmlstr TYPE xstring,
      .....

Then the table must be filled with content to export:

 

* add table 1
  GET REFERENCE OF lt_my_table1 INTO ls_source-value.
  ls_source-name = 'my_table1'.
  APPEND ls_source TO lt_source.

* add structure 1
  GET REFERENCE OF lt_my_structure1 INTO ls_source-value.
  ls_source-name = 'my_structure1'.
  APPEND ls_source TO lt_source.

* add table 2
  GET REFERENCE OF lt_my_table2 INTO ls_source-value.
  ls_source-name = 'my_table2'.
  APPEND ls_source TO lt_source.

 

You can see that this is a mixture of tables and structures.


Step 2: then we have to use the transformation

Create the call of the transformation, handled by a try-catch-structure.

 

  TRY.
      CALL TRANSFORMATION zmy_complex_transformation
        SOURCE (lt_source)
        RESULT XML lv_xmlstr.
    CATCH cx_root.
      "error handling ...
  ENDTRY.
  "then follows the implementation to store lv_xmlstr to an XML file

 

Step 3: after this we can develop the XSLT transformation program 'zmy_complex_transformation'

This is just the simple frame you get when you create a new XSLT program:

<xsl:transform xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
  <xsl:output encoding="iso-8859-1" indent="yes" method="xml" version="1.0"/>
  <xsl:strip-space elements="*"/>
  <xsl:template match="/">
    <!-- ************* here is my specific code *************** -->
  </xsl:template>
</xsl:transform>


Step 4: ready?

No ... this was just the framework to export ABAP data via XSLT to XML. While developing the XSLT program you will get a lot of exceptions of type CX_XSLT_..., even in the subsequent code processing.


Now we have to talk about how to get the exact XML output of the ABAP data! The situation is shown very well in the diagram from the SAP help. In our case we want to do the red-marked serialization from ABAP via asXML (by using an XSLT) to XML.

 

serialization.JPG

This was unclear to me for a long time. I did not realize that there is already a transformation named "ID", delivered by SAP, to do exactly this.

So let's have a look at this delivered transformation:

ID.JPG

 

XSLT experts will have some fun with this, because the code is rather simple and passes everything through without any mapping. But if you are not very familiar with it, you will have a thousand questions first.

 

 

So this was the secret and the solution for my problem ...

To get an output I modified the code of step 2 like this:

 

DATA: lv_debug TYPE flag,
      lo_doc   TYPE REF TO cl_xml_document,
      lv_subrc TYPE sy-subrc.
...
      IF lv_debug IS NOT INITIAL. "<<<<<< change manually to X while debugging

        CALL TRANSFORMATION id "SAP standard transformation ... let's pass everything without any mapping
          SOURCE (lt_source)
          RESULT XML lv_xmlstr.

        CLEAR lo_doc.
        CREATE OBJECT lo_doc.

* load hex string to XML document ...
        CALL METHOD lo_doc->parse_xstring
          EXPORTING
            stream  = lv_xmlstr
          RECEIVING
            retcode = lv_subrc.

* ... and do an output to the local filesystem
        CALL METHOD lo_doc->export_to_file
          EXPORTING
            filename = 'c:\temp\input.xml'
          RECEIVING
            retcode  = lv_subrc.

      ELSE.
        CALL TRANSFORMATION zca_tr_export_list
          SOURCE (lt_source)
          RESULT XML lv_xmlstr.
      ENDIF.

Now we have the possibility to create an XML file with the direct ABAP output, to get a better picture of the data structures. After running this code we will have an XML file at the specified path. I call this the input file, because it is the input of our transformation.

 

Then we can go to the XSLT debugger by using this XML file for the XSLT transformation:

 

XSLT Debugger.JPG

 

Within the debugger you can step exactly through the XSLT code and see what's going wrong with it ... or hopefully not.

 

Debugger in action.JPG

 

That's all and happy debugging

I hope you had some fun and can use this information.

 

Ahhh ... I forgot one interesting code gift.


One additional note:

 

You can also implement a nice little XML viewer in your code to display the result ... if you need one. Use this code to get a preview:

  IF lv_xmlstr IS NOT INITIAL.
    CALL FUNCTION 'DISPLAY_XML_STRING'
      EXPORTING
        xml_string      = lv_xmlstr
*       TITLE           =
*       STARTING_X      = 5
*       STARTING_Y      = 5
      EXCEPTIONS
        no_xml_document = 1
        OTHERS          = 2.
  ENDIF.

 

The XML viewer looks like this short example:

XML viewer.jpg

 

Regards,

Markus


Left Join Right Table Selection Limitation Workaround



Prior to ABAP 7.4 it was not possible to use right-table fields in the WHERE clause of a LEFT OUTER JOIN. This required coding multiple SQL selections and merging the data in ABAP. When the condition on the right table is a constant, the limitation can be worked around by wrapping the selection into a view.

For example, for each sales order item we need to get the ship-to party from both the header and the item. Coding the right-table condition directly produces a syntax error.

Z_DEMO_1.jpg

Let's wrap the right-table selection into the ZVBPA_WE view.

ZVBPA_WE.jpg

Using the ZVBPA_WE view does the trick.

Z_DEMO_2.jpg
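Since the actual code is only visible in the screenshots, here is a minimal sketch of the pattern (table and field names are assumptions based on the standard SD tables VBAP and VBPA; ZVBPA_WE is assumed to be a database view on VBPA restricted to PARVW = 'WE', and WT_DATA is the assumed name of the result table):

* Before 7.40 this variant fails, because a right-table field
* appears in the WHERE clause of the LEFT OUTER JOIN:
*   ... FROM vbap AS i
*       LEFT OUTER JOIN vbpa AS p ON p~vbeln = i~vbeln
*                                AND p~posnr = i~posnr
*       WHERE p~parvw = 'WE' ...

* With the constant condition moved into the view, the WHERE clause
* no longer needs any right-table field:
SELECT i~vbeln i~posnr p~kunnr
  INTO TABLE wt_data
  FROM vbap AS i
  LEFT OUTER JOIN zvbpa_we AS p ON p~vbeln = i~vbeln
                               AND p~posnr = i~posnr.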

Let's run the program and make sure it selects the data as expected. Yes, it does.

WT_DATA.jpg

A Small tip to get all transparent tables used in ABAP code


If you need a list of all transparent tables used in a given ABAP class (or function module, objects belonging to a given transport request, etc.) for analysis purposes, you can follow the tip below; it is very simple but efficient.

 

Suppose you need to scan ABAP class CL_CRM_OPPORTUNITY_IMPL to find out all transparent tables used by this class.

 

1. Use transaction SCI and create a new check variant:

clipboard1.png

Enable "Table Names from SELECT statements" and save variant.


clipboard2.png

2. Create a new inspection:

clipboard3.png

Specify the class to be scanned, and load the check variant created in step one, then execute the inspection.

clipboard4.png

3. Once the inspection is done, you see a green light and can get the inspection result by clicking the button below:

clipboard5.png

All accessed tables are listed there:

clipboard6.png

Double click on each entry and the ABAP code will automatically be opened. Quite easy, isn't it?

 

Summary

 

The code inspection is done statically by scanning the source code, so any tables that are accessed via dynamic SQL at runtime will not appear in the Code Inspector result. If you need the COMPLETE list of tables involved in a given piece of ABAP code, it is recommended to use transaction ST05, ST12 or SAT to trace the scenario at runtime.

 

Further reading

 

1. The ABAP Code Inspector is far more than a static code scanner; in my opinion it is a powerful weapon that every ABAPer should add to their toolbox. For more extensive usage tips to make your life easier, please read the document Useful tips regarding ABAP code inspector that you may not know.


2. Besides the approach to get the accessed-table list introduced in this blog, there is another approach using transaction SQF, which is also based on a static code scan. For details please read ABAP static analysis tool SQF.

Customizing Code Inspector


Introduction:

 

  Precaution is always better than cure! Although this is an old thought, it has a very special meaning, and I thought to apply it while working as a developer in SAP ABAP.

Instead of facing problems with developments after releasing them to production, why not review the code in the development environment, with less manual intervention and high accuracy? While exploring the various checks provided by SAP, I came across a very powerful code review tool, the Code Inspector, and it was exactly what I was searching for.

 

About Code Inspector:

 

  The Code Inspector is a powerful tool provided by SAP to review objects developed through transactions SE11, SE24, SE37, SE38, or SE80. It is mainly used for the static analysis of ABAP programs and other SAP repository objects.

The Code Inspector provides a detailed analysis with an appropriate message and line number.

 

How to customize Code Inspector:

 

The Code Inspector can be customized with the following steps:

  • Create the code inspector variant in SCI.
  • Make the variant as default variant.
  • Send the result of code inspector in Email.

 

Creating Code Inspector Variant

   

       A Code Inspector variant is created with transaction SCI. When you create the variant, you have the option to select the checks as per your requirements and standards:

 

 

1.      Create the global variant in SCI.

  1.png

2.  Customize the checks as per your requirements when creating this variant. It is possible that not all SAP-provided check classes have been selected. You can add extra checks by selecting the required classes; this can be done via:

            SCI -> Goto -> Management of -> Test Console

2.png

       From this list check the class that you want to make part of your variant and save.

 

     3. While creating the variant, select the checks you want to perform and provide the necessary input parameters. Information about the checks can be

          obtained by clicking the information icon:

3.jpg

 

     4. Expand each of the check categories to view all the sub-category checks. Parameters can be set according to your requirements. For example, in the

          Programming Conventions category, for the naming convention check, you can set the convention check as per your project standard:

4.png

 

       Similarly other input parameters can be set for different checks.

 

Creating Your Own Checks

 

   There might be a requirement where you want to add your own checks and no SAP-delivered check class covers them. Don't worry, there is everything in SAP; you just need to explore it.

   This can be achieved by creating your own class and writing your own rule. To know more, follow the blog:

code-inspector--how-to-create-a-new-check.

 

Making Variant as Default Variant

 

  Instead of executing the check with your own variant in SCI each time, you can make your variant the default. That is, every time you run a Code Inspector check, your variant will be executed. This can be done as follows:

Go to transaction SE16, open table SCICHKV_ALTER, select the entry with the DEFAULT flag and click the edit button.

Change the fields CHECKVNAME_NEW, RESPONSIBL and CREADATE and click the save button.

 

Sending Result in Email

 

  You can send the Code Inspector result by e-mail. Execute the Code Inspector; on the result screen, choose the result list option, and on the result list screen click 'Mail recipient', where you can add the recipient's e-mail address and send the result.

  You can customize the Code Inspector further as per your needs by exploring it, and thus take precautions with your code before it can do any harm.

 

     

          

How SAP Jam provides value to IT: An interview with Daisy Hernandez


daisy_hernandez.jpg

Daisy Hernandez is vice president of enterprise social software at SAP, leading the product management team for SAP Jam. The recent 1505 release of SAP Jam includes enhancements that make life better for IT, and to find out more, I spoke to her.

 

What’s new in release 1505 that provides value to IT?

 

There are two main areas that we’ve improved: First, it’s now easier for IT to use SAP Jam to manage their own work. Second, SAP Jam now helps IT serve stakeholders in their enterprise better.

 

To help IT manage their own work, we’ve created two new work patterns.

 

The Issue Escalation work pattern makes it easy to pull tickets from a service request system into SAP Jam, where IT can then pull in experts from the right teams to help resolve a tough issue. Even if those other experts aren’t in IT and don’t have access to the service request system, they’ll be able to communicate and solve problems as a team with IT in SAP Jam – in context of the most up-to-date information from the service request system.

 

The IT Projects work pattern is designed for projects that IT is tasked with or funded for. If you’re developing a new application, rolling out a new tool, or upgrading an application, this work pattern will make it easy to track requirements and enhancements as the team works together on the project.

 

"To help IT better serve their stakeholders, we’ve added capabilities that make it possible to deliver seamless integrations in more ways."

To help IT better serve their stakeholders, we’ve added capabilities that make it possible to deliver seamless integrations in more ways.

 

For instance, we’ve created what we call an activity hub: It’s a desktop application that allows users to pull together an aggregate view of feeds across multiple enterprise social networks. Instead of having to log into to more than one social network, they’ll have a single place where they can see all activity.

 

There’s also a new desktop file sync client that’s perfect for employees who travel frequently and need to have the latest versions of documents available, even when they don’t have internet access. For instance, salespeople who visit a prospect – but can’t connect to wireless – will still be able to access and deliver their presentations.

 

Because SAP Jam supports OpenSocial, it means IT can integrate even more capabilities from other tools with SAP Jam, opening up a vast array of possibilities. Support for Google spreadsheets, or even web conferencing, could be added using OpenSocial.

 

Support for mobile users has been expanded. SAP Jam now supports Android and includes a cross-section of features that build on its enterprise readiness: Secure storage, passcode security, and mobile auditing.

 

Finally, we’ve improved the onboarding experience for company admins. It’s much easier to get started – checklists and wizards within SAP Jam provide information about the tools available to company admins and let them choose which tools to support.

 

How does this release build on what’s already in SAP Jam?

 

It’s simple: SAP Jam has always been about providing collaboration where it matters – within the business processes that drive results for your company. These new capabilities continue to support that strategy by further connecting people, content, data, and processes so they can get work done.

 

The new work patterns for IT build on ones we’ve had in market that have already been supporting IT: The Onboarding work pattern helps IT ramp up new hires. The Help and Support work pattern helps them answer questions in a forum that makes the answers available to everyone in the company. The Planning and Implementation work pattern is perfect for smaller projects that don’t have multiple phases.

 

In turn, the integrations build on many that have already provided value to IT, like integrations for Microsoft Outlook, Microsoft Office, and other productivity applications; integrations for search technologies that support federated search; and integrations for multiple content management systems.

 

What can people who are in IT do to learn more about how SAP Jam can provide value?

 

The free developer edition of SAP Jam is a great place to start. That’s a place where you can work on customizations and try out the work patterns by integrating service request applications. You can learn more about it and register for it here.

 

I’d also recommend reading the release notes and developer admin guide– so that you can get familiar with all the new integration possibilities.

 

To try out the activity hub and desktop file sync, you can ask your customer support representative for details. And for even more information about this release, please reach out to your account or service representative, and they’d be happy to help.

Dangerous modifications II - why you should never execute a LOOP on internal tables XVBRK and XVBRP


One of the more common mistakes I have come across again and again in my own work as an SAP Support Engineer is caused by modifications in user-exits. In the include RV60AFZZ (transaction SE38) several exits are available, which are executed during invoice creation. Of those, two are quite popular, as they are relevant for influencing the pricing functionality. Everybody has their own requirements in the pricing area.

 

In FORM USEREXIT_PRICING_PREPARE_TKOMK additional custom fields can be filled in TKOMK, which is the header level.


And the user-exit for the item level (TKOMP) is FORM USEREXIT_PRICING_PREPARE_TKOMP.

 

Both are called at the start of the pricing processing in include LV60AA58, FORM PREISFINDUNG_VORBEREITEN: first the header one, and then, at the very end, the one for the item fields:


351 PERFORM userexit_pricing_prepare_tkomk.

....

785 PERFORM userexit_pricing_prepare_tkomp.


Now the usual way here is to simply fill your own Z-fields, as is suggested:


TKOMK-zzfield = xxxx-zzfield2.


But what happens if you execute a LOOP on the internal table XVBRK here? Then the result is not an error message, as in my previous post, but instead the created invoice could be inconsistent.


In a recent case that I investigated, two deliveries with entirely different payers had been billed together in transaction VF01. In the TKOMK user-exit such a LOOP command was executed:


LOOP_XVBRK.gif

Now in this case an invoice split has to be performed, due to different partner data, and in XVBRK two entries exist. At the point before the LOOP command the second invoice is processed by the system and visible in the debugger as the currently processed line.

 

After the LOOP is executed, this line is changed, and now instead of the second invoice, the first one is the one to be processed further:

 

AFTER_LOOP.gif

 

The undesired result of this manipulation: the second invoice will get the wrong partner data! It will receive the same partners as the first invoice!

So such a direct LOOP must be avoided. If the data of XVBRK is needed, then a custom table ZVBRK has to be used, with the data copied into it first. A LOOP on this Z-table will then not change the current processing line of XVBRK.
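A minimal sketch of this safer pattern inside USEREXIT_PRICING_PREPARE_TKOMK (the line type VBRKVB and the local table name are assumptions; the only point is that the LOOP runs over a copy, never over XVBRK itself):

  DATA: lt_zvbrk TYPE STANDARD TABLE OF vbrkvb,
        ls_zvbrk TYPE vbrkvb.

* Copy the system-managed table first - its current processing line must not be touched
  lt_zvbrk[] = xvbrk[].

* Loop over the copy; the processing line of XVBRK stays untouched
  LOOP AT lt_zvbrk INTO ls_zvbrk.
*   ... read whatever is needed, e.g. ls_zvbrk-kunrg (payer) ...
  ENDLOOP.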

 

In other cases, when the LOOP is performed on XVBRP in the other exit, USEREXIT_PRICING_PREPARE_TKOMP, the error message VF004 "Reference document & & (Error during INSERT)" can be the result if more than one item has to be invoiced.


Again with the LOOP command the current processing line is changed, and then the inserting of the second item will fail, because the system thinks it is the first one, which is already present in XVBRP:


VF004.gif

 

So my suggestion would be to strictly refrain from such LOOP commands on internal SAP tables which are processed by the system.

 

 

Further reading:

Note 381348
