For training on SAP Basis and Netweaver with Oracle DB Administration, contact me at shrikshetty@gmail.com
Thursday, December 27, 2012
SPAM Steps
The SAP Patch Manager informs you of the status of the step currently being executed in the status bar. If you want to know which steps are executed for which scenario, run the program RSSPAM10.
The following list explains the steps executed by SPAM in the order in which they are performed.
PROLOGUE
In this step, the system checks if you are authorized to import Support Packages.
CHECK_REQUIREMENTS
In this step, various requirements for importing a Support Package are checked, for example, whether the transport control program tp can log on to your system.
DISASSEMBLE
In this step, files from the corresponding EPS parcels are disassembled, or unpacked, and stored in the transport directory.
ADD_TO_BUFFER
In this step, the queue is put into the transport buffer of your system.
TEST_IMPORT
In this step, a test import is performed to check whether the import will succeed, for example, whether objects that will be overwritten by the Support Package are still in unconfirmed repairs.
IMPORT_OBJECT_LIST
In this step, the object lists for the Support Packages in the queue are imported into the system.
OBJECTS_LOCKED_?
In this step, the system checks whether there are objects locked in unreleased requests that will be overwritten when the Support Package is imported.
SCHEDULE_RDDIMPDP
In this step, the transport daemon (program RDDIMPDP) is scheduled.
ADDON_CONFLICTS_?
In this step, the system checks if there are conflicts between objects in the queue and installed add-ons.
SPDD_SPAU_CHECK
In this step, the system checks if a modification adjustment is necessary (Transactions SPDD/SPAU).
DDIC_IMPORT
In this step, all the ABAP Dictionary objects of the queue are imported.
AUTO_MOD_SPDD
In this step, the system checks if modifications to the ABAP Dictionary objects can be adjusted automatically.
RUN_SPDD_?
In this step, you are asked to adjust your modifications to the ABAP Dictionary objects by calling Transaction SPDD.
IMPORT_PROPER
In this step, all the Repository objects and table entries are imported. Then actions such as distribution, conversion, activation and generation occur.
AUTO_MOD_SPAU
In this step, the system checks if modifications can be adjusted automatically.
RUN_SPAU_?
In this step, you are asked to adjust your modifications to the Repository objects by calling Transaction SPAU.
EPILOGUE
In this step, the import of the Support Package is ended. The system checks if the queue has been completely processed.
Wednesday, December 26, 2012
AS ABAP Work Process Overview
There are five types of work processes:
Dialog Work Process (DIA): Every interactive user request is processed by a dialog work process. As a rule of thumb, the average dialog response time should stay below about one second. Every instance must have at least 2 dialog work processes, and the number can be increased as required.
Background Work Process (BCK): Long-running batch jobs and reports that need no user interaction are handled by background work processes. Every instance should have at least 1 background work process, and the number can be increased as required.
Update Work Process (V1, V2): All update-related requests are handled by update work processes. Updates are of two types: time-critical V1 updates and non-time-critical V2 updates. Every instance should have at least 1 update work process, and the number can be increased as required.
Enqueue Work Process (ENQ): The enqueue work process implements the SAP lock mechanism. When two users try to change the same data at the same time, the enqueue work process locks the affected table entries against the second user and releases the lock once the first user's change is saved and committed. One enqueue work process per system (on the central instance) is sufficient, and the number should not be increased.
Spool Work Process (SPO): All print-related requests are handled by spool work processes. Every instance should have at least 1 spool work process, and the number can be increased as required.
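The work process mix of an instance is configured through instance profile parameters. A minimal profile fragment might look like this (the rdisp/* parameter names are the standard SAP ones; the values shown are purely illustrative and must be sized for your own workload):

```
# Instance profile fragment (illustrative values only)
rdisp/wp_no_dia = 10   # dialog work processes (at least 2 per instance)
rdisp/wp_no_btc = 3    # background work processes
rdisp/wp_no_vb  = 2    # V1 update work processes
rdisp/wp_no_vb2 = 1    # V2 update work processes
rdisp/wp_no_enq = 1    # enqueue work process (central instance only)
rdisp/wp_no_spo = 1    # spool work processes
```

After changing these parameters, the instance must be restarted for them to take effect.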
Work Process Architecture:
Work processes execute the process logic of application programs. In addition to
internal memory, a work process contains a task handler (which coordinates the
actions within the work process), software processors, and a database interface. The dynpro processor
executes the screen flow logic of the application program, calls processing logic
modules, and transfers field content to the processing logic. The actual processing
logic of ABAP application programs is executed by the ABAP interpreter. The
screen processor tells the ABAP processor which subprogram needs to be executed,
depending on the processing status of the screen flow logic.
The dialog work process selected by the dispatcher performs a roll-in of the user
context first. That is, the data that contains the current processing status of a running
program as well as data that characterizes the user is made known to the work process.
The work process then processes the user request, which may involve, for example,
requesting data from the database or from the buffers in the shared memory. Once the
dialog work process has processed the dialog step, the work process returns the result,
rolls the user context back out to the shared memory, and is now available again
for a new user request from the request queue. The result is transferred to the SAP
GUI and the user sees the new screen.
Database Interface of AS ABAP:
Relational Database Management Systems (RDBMS) are generally used to manage
large sets of data. An RDBMS saves data and relationships between data in the form
of two-dimensional tables. These are known for their logical simplicity. Data, tables,
and table relationships are defined at database level in the database catalog (the data
dictionary) of the RDBMS.
Within the SAP programming language ABAP, you can use ABAP Open SQL (SQL =
Structured Query Language, database query language) to access the application data
in the database, regardless of the RDBMS used. The database interface, which is part
of every work process of AS ABAP, translates Open SQL statements from ABAP into
the corresponding SQL statements for the specific database used (Native SQL). This
allows ABAP programs to be database-independent.
When interpreting Open SQL statements, the SAP database interface checks the
syntax of these statements and ensures the optimal utilization of the local SAP buffers
in the shared memory of the application server. Data that is frequently required by
the applications is stored in these buffers so that the system does not have to access
the database server to read this data. In particular, all technical data, such as ABAP
programs, screens, and ABAP Dictionary information, as well as a number of business
administration parameters, usually remain unchanged in an operational system and
are therefore ideally suited to buffering.
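As a small illustration of this database independence, the following report uses Open SQL only; the database interface of the work process translates the SELECT into the Native SQL of whichever RDBMS underlies the system. (SFLIGHT is the standard flight-model demo table; if it is not available in your system, substitute any transparent table.)

```
REPORT z_open_sql_demo.

DATA: lt_flights TYPE TABLE OF sflight.

* Open SQL: the same statement runs unchanged on Oracle, DB2, MaxDB, etc.
* The database interface of the work process generates the Native SQL.
SELECT * FROM sflight
  INTO TABLE lt_flights
  WHERE carrid = 'LH'.

WRITE: / 'Rows read:', sy-dbcnt.
```

The report and the literal carrier ID 'LH' are illustrative; sy-dbcnt simply shows how many rows the statement returned.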
Friday, December 7, 2012
Scheduling Standard Background Jobs
Scheduling standard background jobs is one of the post-installation activities.
Below are the steps to schedule Standard jobs:
Go to transaction SM36 and click the "Standard jobs" pushbutton. This displays the standard jobs screen. Select all the jobs and click the "Default scheduling" pushbutton to schedule them according to their default schedules.
If you would like to change the default schedule of a job, you can do so by selecting the job and defining its start date/time and periodicity in the same screen.
What is the need of scheduling basis standard jobs ?
Standard jobs are the jobs that should run regularly in every SAP system. They perform housekeeping such as deleting old spool requests (thus avoiding spool overflow), deleting old background jobs/logs/updates/batch input sessions/ABAP short dumps, and collecting operating system and database statistics (used for workload reporting).
What are the Standard jobs that should run in an SAP system & their significance ?
SAP_CCMS_MONI_BATCH_DP : Internally this job runs the RSAL_BATCH_TOOL_DISPATCHING report. This job dispatches monitoring architecture methods
SAP_COLLECTOR_FOR_JOBSTATISTIC : Internally this job runs RSBPCOLL report. This job generates run time statistics for background jobs
SAP_COLLECTOR_FOR_PERFMONITOR : Internally this job runs RSCOLL00 report. This job collects data for the performance monitor
SAP_COLLECTOR_FOR_NONE_R3_STAT : Internally this job runs RSN3_STAT_COLLECTOR report. This job collects non-ABAP statistics data (Distributed Statistics Records - DSR)
SAP_REORG_ABAP_DUMPS : Internally this job runs RSSNAPDL report. This job cleans up old ABAP short dumps
SAP_REORG_BATCH_INPUT : Internally this job runs RSBDCREO report. This job cleans up old batch input sessions
SAP_REORG_JOBS : Internally this job runs RSBTCDEL report. This job cleans up old background jobs
SAP_REORG_JOBSTATISTIC : Internally this job runs RSBPSTDE report. This job cleans up old data from the run time statistics of the jobs
SAP_REORG_ORPHANED_JOBLOGS : Internally this job runs RSTS0024 report. This job cleans up orphaned job logs: logs that cannot be deleted by the RSBTCDEL report (i.e. SAP_REORG_JOBS) remain as orphans, and this job deletes them.
SAP_REORG_SPOOL : This job internally runs RSPO0041 report. This job deletes old spool data
SAP_REORG_XMILOG : This job internally runs RSXMILOGREORG. This job deletes XMI logs
SAP_SOAP_RUNTIME_MANAGEMENT : This job internally runs RSWSMANAGEMENT report. This job does the SOAP runtime monitoring
SAP_REORG_UPDATERECORDS : This job internally runs RSM13002 report and this deletes old update records
Scheduling Background Job after Triggering an Event
Step 1: Create the event in transaction SM62.
Enter the event name and description and press Save.
Step 2: Create a program that triggers this event by calling the function module BP_EVENT_RAISE.
*&---------------------------------------------------------------------*
*& Report Z_TRIGGER_EVENT
*&---------------------------------------------------------------------*
REPORT z_trigger_event.

* Raise the background processing event so that any job waiting on it starts
CALL FUNCTION 'BP_EVENT_RAISE'
  EXPORTING
    eventid                = 'Z_TRIGGER_JOB'
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    OTHERS                 = 5.

IF sy-subrc <> 0.
  WRITE: / 'Event failed to trigger'.
ELSE.
  WRITE: / 'Event triggered'.
ENDIF.
Step 3: Configure the background job in transaction SM36.
In the initial screen, enter the job name and job class and press the "Start condition" button.
In the popup screen, press the "After event" button, enter the event name, and press Save.

Now go back to the initial screen and press the "Step" button.

Provide the program and variant names and, after providing all the values, press Save.

In the initial screen, press Save.
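For completeness, the same event-based start condition can also be set up programmatically with the standard function modules JOB_OPEN and JOB_CLOSE. The job, report, and event names below are just the ones used in this sketch:

```
REPORT z_schedule_event_job.

DATA: lv_jobcount TYPE tbtcjob-jobcount.

* Open the job definition (equivalent to entering the job name in SM36)
CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = 'Z_EVENT_JOB'
  IMPORTING
    jobcount = lv_jobcount.

* Add a step: Z_SOME_REPORT is a placeholder for the report to execute
SUBMIT z_some_report VIA JOB 'Z_EVENT_JOB' NUMBER lv_jobcount
       AND RETURN.

* Close the job with an event-based start condition
CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount = lv_jobcount
    jobname  = 'Z_EVENT_JOB'
    event_id = 'Z_TRIGGER_JOB'.
```

Once closed this way, the job stays in "Released" status until the event is raised.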

Step 4: Now execute the program to trigger the event, and then check the background job.
Run transaction SM37.
Check the status of the job created by the program.
Now check the spool to see the generated list.