Doxxy 1.2 has been released

Today we have great news for you: Docufy becomes Doxxy!
And it's not only the name that has improved!

Doxxy is a RAD tool for generating operational reports. With its intuitive APEX UI, you can easily configure your documents by adding DOCX templates and SQL queries. The engine is written in PL/SQL, which makes installation, integration and maintenance very straightforward. The tool comes as a packaged application for APEX 4.x.

The main concepts and principles are still the same:

  • Simple architecture and installation
  • User-friendly RAD-tool
  • Document layout via MS Word templates
  • Datasets via SQL statements
  • Generation of DOCX documents
  • Easy integration with the development software of your choice
  • Master-detail structures possible

On the occasion of last month's APEX World, we released Doxxy 1.2. This version includes some interesting new features.

What is new?

First of all, Doxxy is a tool for developers: for APEX developers, certainly, but in fact for anyone who develops against an Oracle database and needs printable output. Until now, the reporting engine generated a .DOCX file as the printable document. In version 1.2 there is an extra option that makes it possible to produce a PDF document as output.

Other new features we added to the product are:

  1. Possibility to add PL/SQL logic at the beginning or at the end of the generation process (see the sketch after this list).
    Possible use cases are:
    a) at the beginning of a report, setting an Oracle context with a language indicator, or preparing your data in temporary tables to make the querying easier;
    b) at the end, updating a print status or flag on the given records.
  2. Performance optimization for documents with a lot of content or a lot of IF-statements.
  3. The export and import mechanism is XML-based. It is now also possible to export/import multiple documents from a given folder in one run.
  4. Easy search box to quickly find a document in the object tree.
  5. Template visualization and validation: when you upload a template, the system performs some basic validations on the 'formal' content of the template, especially on the names of the tags.
    From within the Doxxy UI you can also visualize the formal structure of your templates. Errors are visually emphasised in red.
  6. Simplified mechanism to include images (coming from a BLOB column) in the report output.
  7. Extra page to maintain your Doxxy-specific private synonyms.
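
To make feature 1 concrete, here is a minimal sketch of the kind of PL/SQL you might hook in before and after generation. The context name, table and column names below are purely hypothetical; how the blocks are registered is configured in the Doxxy UI itself.

-- Hypothetical "before generation" logic: set a language indicator in an
-- application context (assumes a context REPORT_CTX created with
-- "CREATE CONTEXT report_ctx USING report_ctx_pkg", so this call would
-- normally live inside that trusted package).
BEGIN
   dbms_session.set_context('REPORT_CTX', 'LANGUAGE', 'NL');
END;
/

-- Hypothetical "after generation" logic: update a print status/flag on the
-- records that were just reported on (table and columns are examples only).
BEGIN
   UPDATE orders
      SET print_status = 'PRINTED',
          print_date   = SYSDATE
    WHERE print_status = 'PENDING';
END;
/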

Give Doxxy a try and request a free trial!

Follow @doxxyNews on Twitter.
Website: www.doxxy.eu.

OGH APEX World 2014

Last week we attended the 5th annual APEX World event in Zeist. As every year, it was very nice to meet the growing APEX community in the Benelux, combined with some excellent international and Dutch APEX presentations.
The keynote about APEX 5.0 was given by Joel Kallman, followed by 18 very interesting sessions covering customer business cases, technical developments and international presentations by APEX specialists from all over the world.

APEX 5.0

The key focus of the new APEX 5.0 is improved developer productivity.
The page builder is completely new. Through this interface developers will be able to do more in less time and, most importantly, in fewer clicks. With a properties sidebar on the right side of the screen it will be possible to quickly change elements and regions on a page, even multiple elements at the same time! Regions and items can be created through drag and drop, which increases development speed.

Other new features

Tabs
Improved tab navigation. The current tab system isn't user-friendly enough, so it's better to use lists. Now you can create new pages and define their hierarchy in the application. When this is done, a tab with dropdown submenus is created automatically to display the hierarchy.

Interactive reports
Two important improvements for interactive reports. First and foremost, it's possible to have multiple interactive reports on one page, something we've all been waiting for since APEX 4.x. Secondly, there is a new format function to pivot your report. Joel Kallman presented this feature: in a couple of clicks he created a nice pivoted table on the screen.

jQuery Mobile integration
With jQuery Mobile your SQL reports can be made responsive. You have the option to:
a) only display the most important columns on a small screen, or
b) switch to some kind of single record view. The result is something similar to what you can see here: http://elvery.net/demo/responsive-tables/

Modal popup
Instead of using a plugin to let your pages open in a modal window, users can now set this feature as a property of the page. Whenever the user navigates to this page, it will open in a modal window.

Be sure to take a look at the APEX early adopter instance: apexea.oracle.com


Presentations

After the APEX 5.0 demonstration, there were 3 parallel tracks, all with very different and interesting sessions. Read our impressions …

Going public with your APEX application
FOEX delivered this presentation very well. Their problem scenario was the following: if you want to make a public APEX application, you are always stuck with the typical APEX URL like "apex/f?p=100:1:5039230103::::". During the demo they showed how to create a nice and readable URL like "apex/demo/customers". To accomplish this they used aliases, REST services, PL/SQL and a few lines of JavaScript.
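
The exact implementation wasn't published, but the general idea can be sketched in a few lines of PL/SQL: expose a procedure (or REST handler) under a readable path and let it redirect to the underlying APEX page, addressed by its application and page aliases rather than numeric IDs. The names below (the procedure, the DEMO application alias and the CUSTOMERS page alias) are assumptions for illustration only.

-- Hypothetical procedure, reachable as e.g. /apex/demo_customers via the
-- PL/SQL gateway or an ORDS mapping; it simply redirects the browser to
-- the APEX page behind it.
CREATE OR REPLACE PROCEDURE demo_customers
IS
BEGIN
   -- 'DEMO' = application alias, 'CUSTOMERS' = page alias (assumed names)
   owa_util.redirect_url('f?p=DEMO:CUSTOMERS:0');
END demo_customers;
/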

The best of both worlds: going hybrid with your mobile APEX application
Roel Hartman gave a presentation about Phonegap in combination with APEX. He showed a nice demo on how to sync the contacts from a database with the ones from his cell phone through a Phonegap App. It was surprising how easily this could be setup without too much code and in-depth knowledge. He used REST services to sync the data between APEX and his cellphone.

Using AngularJS in Oracle Application Express
Dan McGhan of Enkitec (USA) gave a technical session about combining AngularJS and APEX. He showed us a single-page application containing a to-do list with advanced calendar features. The end result was very nice and the demo illustrated the power of AngularJS, but it certainly requires some time to understand this framework. Maybe an interesting idea would be to include AngularJS natively in APEX 6.0?

A B2B webshop with APEX!
iAdvise did two presentations. The first one dealt with a B2B webshop we developed in APEX for Billiet. Justine Ghekiere gave a brief introduction about the core business of her company, Billiet. Our colleague Tuur Hendrickx then showed a lot of features he implemented in the webshop with APEX. Topics he showcased were special advertisements, restricted products for different customers, the use of a shopping cart and a stunning layout.

APEX & HTML5
We also attended a nice presentation by Martin Giffy D’Souza about APEX and HTML5. He showed the advantages of HTML5 and the typical use cases in APEX. During a live demo he showed how to record a video within APEX and stream the feed to another frame on the same screen. Really impressive! It was also nice to see how easy it is to implement voice recognition by using HTML5.

Dutch immigration services (IND) monitor XML messages with Oracle APEX
A department of the Dutch government has built an application which provides residence permits to immigrants or refugees. Before they could start building the APEX application there was a lot of effort necessary in the Oracle database for dealing with all the XML files. It was not just a problem with the size of the XML files, but there were also issues with differences between Oracle 10.2 and 11.2 in the way the database handles XML files.

Reporting solutions for Oracle APEX – choose your weapons
During this session Dietmar Aust gave us an overview of possible reporting solutions for APEX applications. Many solutions were covered in an objective way: BI Publisher, Jasper Reports, Apache FOP, APEX PDF printing, PL/PDF, … Dietmar even demonstrated our own tool Doxxy (www.doxxy.eu). Nice to hear that he likes Doxxy! He also showed us his own solution for typical problems related to exporting data from interactive reports to MS Excel, especially regarding the proper data types: OPAL:XP (for eXPorting to MS Excel).

Single-click deployment in APEX development
One of the last tracks we visited was about single-click deployment of APEX applications across OTAP environments. They talked about the use of Bamboo in combination with Git and APEX. It was nice to see how they solved the problem of continuous integration with APEX.

A logistics data portal with APEX!
In the second iAdvise customer case Robert Esseling explained why Bas Logistics needed a data portal. Those requirements were then demonstrated by Menno Hoogendijk.
The portal has an admin module to manage the data import and mapping settings. In the very straightforward front end, users drill down from dashboards to detailed data.


Thanks to the organization for hosting this great event, really one of the best conferences in the Benelux!
See you at APEX World 2015!

wpg_docload.download_file: mime type not recognized by client

For a project we are currently working on, we needed to generate and send a Word 2010 document to the client. The document was generated by a great PL/SQL document generation tool called Doxxy, and was sent to the client using the wpg_docload package. This is a standard Oracle PL/SQL package that can be used to download files, BLOBs and BFILEs.

Before the download, we set the Content-Type in the HTTP header as follows:

owa_util.mime_header('application/vnd.openxmlformats-officedocument.wordprocessingml.document',FALSE);

When sending the document to the client, we got the following popup in our browser:

(image: browser popup)

So it looked like our browser didn't recognize that this was a Word 2010 document.

Looking at the response header using Firebug, we got the following result:

(image: response header in Firebug)

Somehow the content type for Word 2010 was overwritten to text/html; charset=utf-8.

So, time for the good old trial and error approach, which, after a while, paid off.

Before setting the response header with owa_util.mime_header('….', FALSE), we need to issue the following commands:

htp.flush();
htp.init();

Now the code looks like this:

-- first clear the header
htp.flush;
htp.init;
-- set up the HTTP header
owa_util.mime_header('application/vnd.openxmlformats-officedocument.wordprocessingml.document', FALSE);
-- set the size so the browser knows how much to download
htp.p('Content-length: ' || dbms_lob.getlength(v_blob));
-- the filename will be used by the browser if the user does a "Save as"
htp.p('Content-Disposition: attachment; filename="'||nvl(v_filename,'export')||v_ext||'"');
-- set a cookie (for the javascript download plugin)
htp.p('Set-Cookie: fileDownload=true; path=/');
-- close the headers
owa_util.http_header_close;
-- download the BLOB
wpg_docload.download_file(v_blob);

After adding these 2 lines, we got the correct mime type:

(image: corrected response header in Firebug)

Many thanks to Willem Albert and Bjorn Fraeys for delivering the content for this blog!

Using Talend to read tweets

A project regarding visual discovery required me to look into the possibilities of reading tweets. During my search for a suitable method to accomplish this, I came across Talend Open Studio. This is an open source data integration product which allowed me to read tweets in a very easy way. You won't have to use the Twitter API manually to get the raw JSON; instead, you can load and transform the structured data into a file.

In this post we’ll go through the steps you have to take to get the Talend software to use this API and get you the tweets you want. Please keep in mind that, in order to successfully do this, you will need a Twitter account to authenticate yourself and some custom components. Both will be discussed later on.

As you might know, it used to be possible to get an unlimited amount of tweets for free using the Twitter API v1. Unfortunately, Twitter decided to retire this API and replace it with the Twitter API v1.1 which requires you to authenticate before you get the (limited amount of) tweets you want.

Step 1: Adding custom Twitter components

Unfortunately, the components you need aren't included in Talend by default. Therefore, you will need to add the following three items to your palette: tTwitterOAuth, tTwitterOAuthClose and tTwitterInput. These are made by Gabriele Baldassarre and can be downloaded on his website: http://gabrielebaldassarre.com. Place the components in a location that's easy to remember, keeping in mind that you'll need to use the location's path in a second. For example:

Location example

Now we’re going to add these to our palette. In Talend, go to “Window – Preferences – Talend – Components” and fill in the correct path next to ‘User component folder:’.

User component folder

Click on Apply/OK. As you can see, the components can now be found in your palette:

Twitter palette

Step 2: Configuring tTwitterOAuth

tTwitterOAuth is responsible for the connection and authentication to Twitter. As said before, in order to use this component you will need a Twitter account. Add the component to your job.

tTwitterOAuth

As you can see in the component's settings, it requires 4 different strings (consumer key, consumer secret, access token and access token secret) in order to work. We'll now explain how to get these.

Go to the Twitter developers website (https://dev.twitter.com/) and click on “My applications” (click on your profile picture in the upper right corner).

Twitter My applications

New Twitter application

Create a new application by clicking on “Create New App” (as seen in the picture above) and complete the given form. If you don’t have a website you can use a dummy link, for example your localhost (http://127.0.0.1).

Twitter application form

Upon returning to the applications page, you'll see your recently created application. Go to its detail page by clicking on it. If you click on "Test OAuth" you'll find the information you need in order for your Talend component to work. This button can be found in the upper right corner as shown in the image below:

Test OAuth

Copy the consumer key, consumer secret, access token and access token secret into the matching fields of the tTwitterOAuth component. Keep in mind that these are strings and should therefore be surrounded by double quotes.

Settings tTwitterOAuth

Step 3: Linking tTwitterInput with tTwitterOAuth and tTwitterOAuthClose

Add a tTwitterInput component to your job. Whenever tTwitterOAuth has successfully authenticated us, we'll tell our job to proceed to the tTwitterInput component. Do this by right-clicking tTwitterOAuth, selecting "Trigger – On Subjob Ok" and then clicking on tTwitterInput. The components are now linked. At this point, your job should look like this:

connection tTwitterOAuth/tTwitterInput

When tTwitterInput is done, we want to close our authentication. In order to do this, we first need to add a tTwitterOAuthClose component to the job. Next, tell the tTwitterInput component to proceed to the tTwitterOAuthClose component as done before: right-click tTwitterInput, select "Trigger – On Subjob Ok" and click on tTwitterOAuthClose.

Connection Twitter components

Step 4: Configuring tTwitterInput

tTwitterInput error

As you can see, at this moment we get the following error: "Parameter (Conditions) must have at least one value", meaning we can't just get any tweets we want: we have to specify at least one condition in order for this component to be able to search for tweets. For example, let's say we only want tweets containing the hashtag Oracle. This is achieved by double-clicking on tTwitterInput and adding a condition (click on the green plus sign) telling tTwitterInput to include the term "#Oracle".

tTwitterInput basic settings

The error has now disappeared. You can add multiple operations and even decide whether they should have an AND or OR structure. You can even add a filter telling it to only get negative or positive tweets, so technically this could be used for sentiment analysis.

Now we're going to edit the column mapping in order to get the information we want. Keep in mind that, as an example, we're only going to add a couple of basic operations. As you will see in the complete list, there are many more possibilities than we'll be showing. Click on the triple dots next to "Edit schema".

tTwitterInput schema

TweetId is present in the schema by default. Let's go ahead and add the content of the tweet (Text) and the date on which it was published (CreationDate).

tTwitterInput schema details

After you’re done editing the schema, click on OK. Now we have to change the operations of the column mapping to the right values. Change them to the following:

tTwitterInput mapping

Lastly, let's specify some other characteristics of the tweets we want the component to return. Click on the "Advanced settings" tab. Let's say we only want recently published information and limit the number of tweets it should return to 100.

tTwitterInput advanced settings

As you can see, there are multiple conditions that you can edit. Keep in mind that if you decide to specify a date, this should be done in string format (so don't forget to add double quotes or it won't work). Also, there's a limit on how far you can go back in time. As far as I'm aware, the API is limited so that you can only receive tweets that have been published in the last week.

You’re now done and ready to get some tweets!

Step 5: Testing

Let's put the received data into a CSV document. Add a tFileOutputDelimited component to your job. Link it with your tTwitterInput by right-clicking it, selecting "Row – Structured" (you can also decide to write raw JSON if you prefer) and clicking on tFileOutputDelimited. Edit the basic settings of your output file if you'd like to, and run the job.

Twitter end result

Open the created file and voilà, there are your tweets!

Tweets example

Watch out with function result cache based on data dictionary views

Result cache is a powerful tool to gain performance in PL/SQL.
There are many examples on the internet that prove this, e.g. these articles on All Things Oracle:
- Result Cache(1)
- Result Cache(2)

But I’m not going to talk about performance.
This article is some kind of warning.

First I’ll show you how result cache works on a normal view.
I’ll create a table, a view on this table and a function that counts the rows in the view.

SQL> create table x (field1 varchar2(1), field2 number(1));

Table created.

SQL> create or replace view vie_x as select * from x;

View created.

SQL> CREATE OR REPLACE FUNCTION vie_x_rowcount(p_field1 IN VARCHAR2)
RETURN NUMBER RESULT_CACHE
IS
   l_return NUMBER;
BEGIN
   SELECT count(*)
     INTO l_return
     FROM vie_x
     WHERE field1 = p_field1;

   RETURN l_return;

END vie_x_rowcount;
/

Function created.

SQL> insert into x(field1, field2) values('x', 1);

1 row created.

SQL> commit;

Commit complete.

These are the statistics for the result cache, just to show you we’re starting without any caching.

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       0
Find Count		               0
Invalidation Count	           0

When we execute the function, the statistics show that there’s an entry created in the cache.

SQL> select vie_x_rowcount('x') from dual;

VIE_X_ROWCOUNT('X')
-------------------
		  1

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       1
Find Count		               0
Invalidation Count	           0

When we execute the same code again, we’ll get the same result and the statistics show us that the result is found in the cache.
Good job Oracle!

SQL> select vie_x_rowcount('x') from dual;

VIE_X_ROWCOUNT('X')
-------------------
		  1

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       1
Find Count		               1
Invalidation Count	           0

Let’s insert a new row in the table.
This time the statistics show us that the cache is “invalidated”, meaning the function has to be executed again to return the correct value.

SQL> insert into x values('x', 2);

1 row created.

SQL> commit;

Commit complete.

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       1
Find Count		               1
Invalidation Count	           1

And the expected result…

SQL> select vie_x_rowcount('x') from dual;

VIE_X_ROWCOUNT('X')
-------------------
		  2

The Oracle database has its own data dictionary, a set of tables where it stores all information about the database and what's in it.
The data in these tables is available through views, the data dictionary views.
In the following example I'll use the data dictionary view that holds the information on columns.
I created a function that returns the number of columns for a certain table.

SQL> CREATE OR REPLACE FUNCTION number_of_columns(p_table_name VARCHAR2)
RETURN NUMBER RESULT_CACHE
IS

   l_return NUMBER;

BEGIN

   SELECT count(*)
     INTO l_return
     FROM user_tab_columns
    WHERE table_name = p_table_name;

   RETURN l_return;

END number_of_columns;
/

Function created.

To make sure we’ll start with a clean cache, I’ll flush it using the dbms_result_cache.flush procedure.

SQL> execute dbms_result_cache.flush

PL/SQL procedure successfully completed.

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       0
Find Count		               0
Invalidation Count	           0

When we execute the function, we’ll get the expected result: the function is executed and a cache entry is created.

SQL> select number_of_columns('X') from dual;

NUMBER_OF_COLUMNS('X')
----------------------
		     2

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       1
Find Count		               0
Invalidation Count	           0

We can execute it again and see that the return value is retrieved from the cache.

SQL> select number_of_columns('X') from dual;

NUMBER_OF_COLUMNS('X')
----------------------
		     2

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       1
Find Count		               1
Invalidation Count	           0

Let’s add a column to the table.
This should add a new row in a data dictionary table and thus in the data dictionary view we use in our function.

SQL> alter table x add (field3 date);

Table altered.

SQL> desc x
 Name					   Null?    Type
 ------------------------- -------- ----------------------------
 FIELD1 					        VARCHAR2(1)
 FIELD2 					        NUMBER(1)
 FIELD3 					        DATE

Now execute the function again.
And the result is…

SQL> select number_of_columns('X') from dual;

NUMBER_OF_COLUMNS('X')
----------------------
		     2

Not what we expected!
When we take a look at the result cache statistics, it shows that the cache wasn’t invalidated and the result was retrieved from the result cache.

SQL> select name, value from v$result_cache_statistics where name in ('Create Count Success', 'Find Count', 'Invalidation Count');

NAME			               VALUE
------------------------------ ------------------------------
Create Count Success	       1
Find Count		               2
Invalidation Count	           0

When we flush the cache and execute the function again, we'll get the correct result.

SQL> execute dbms_result_cache.flush

PL/SQL procedure successfully completed.

SQL> select number_of_columns('X') from dual;

NUMBER_OF_COLUMNS('X')
----------------------
		     3

So, it seems that the result cache isn’t invalidated on data dictionary tables.
And indeed this is what I found in the Oracle documentation:

You cannot cache results when the following objects or functions are in a query:

  • Temporary tables and tables in the SYS or SYSTEM schemas
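
If you do need a result-cached function on top of a data dictionary view, one pragmatic workaround is sketched below: invalidate (or flush) the cached results yourself after the DDL, using DBMS_RESULT_CACHE. The owner is assumed to be the current schema, as in the example above.

-- Invalidate only the cached results of the function after DDL:
BEGIN
   dbms_result_cache.invalidate(USER, 'NUMBER_OF_COLUMNS');
END;
/

-- Or, more bluntly, clear the whole result cache:
BEGIN
   dbms_result_cache.flush;
END;
/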

APEX 5.0: Modal dialogs have never been so easy!

Like all of us, I have had the pleasure to take a look at APEX 5.0 EA1. One of the most anticipated features was built-in support for modal dialogs. In this blog we will examine this feature more closely.

An easy way to investigate this feature is by creating two simple pages: report & form using the wizard.

(image: modal dialogs)

Follow the wizard and base the report & form on the demo_customer table. When you come to the page where you are creating the form, you will notice the option "Page Mode". This mode will determine the type of page you use.

(image: wizard)

As you can see there are three modes:
- Normal: used by normal pages
- Modal: when this is selected your page will show as a modal dialog
- Non-Modal: in this mode the page will act as a normal pop-up, allowing you to continue working in both windows. Note that the original window that called the pop-up can still interact with this pop-up.

We select the Modal mode and continue the wizard. We have now created two pages, one of them being a modal dialog. So we see here that the modal dialog is a separate page, and not some region on the page that calls it. This allows better re-usability and easier validations. You might notice there is no code anywhere that actually calls the modal page. That is because any page that branches, links or redirects to a page with mode "Modal" will call that page as a modal dialog automatically.

Apart from the mode difference between the two pages, APEX also created a "Close Dialog" process and a dynamic action that fires when the cancel button is pressed on our modal page. Both of these make sure our modal dialog is closed when the end user presses a button.

(image: Close Dialog)

Lastly, we might want our report to refresh when the dialog is closed. To do this we go to our report page and right-click on our report to create a dynamic action. As event, select "Dialog Closed" and as region "Customers".

(images: DA event, DA action)

Then right-click on your dynamic action and create a Refresh action with our Customers region as affected item. Now our Customers region will be refreshed when our modal dialog closes.

Time to test our page!

(image: dialog)

As you can see our modal dialog renders nicely.

There is one last note I would like to add for the developers who like to write the code for their modal dialog themselves: the APEX developers have enabled us to do so. When you go to the theme for our modal page, at the bottom you will notice this code:

(image: dialog init code)

It basically means that, should you wish to, you can modify this code here, or just replace it with your own, allowing for maximum flexibility!

In conclusion I think it is safe to say that the APEX developers have done an ACE job implementing this feature, thumbs up! If you would like to try the EA and modal dialogs for yourself then you can do that here. If you want to read up on the full list of APEX EA1 features you can do that here.

Browser and Session value in APEX

Those of us who have been developing in APEX for a while will understand that the value of an item in APEX is not always what it seems. That's right: an item has two values in APEX. The first value is the session value that we can store in the database (server side); the second value is the value the end user sees on his screen in the browser (client side). On some occasions these values are the same, on others they are not. The difference in value can lead to some confusion, especially for new APEX developers.

Let’s start with an example so we understand the problem. This is actually based on a real use case. I changed the item names and queries to make it more universal, but the principle remains the same. The developer in question had a page with two Select lists containing the following queries:


SELECT 1 d, 1 r
FROM dual

And

SELECT 1 d, 1 r
FROM DUAL
UNION
SELECT 2 d, 2 r
FROM DUAL

The second select list (called P55_SELECT_LIST_2) also had a default value of 2.

When the user selects a different value in the first select list, the second select list has to change its value accordingly.

In order to achieve this, the developer had created a dynamic action. As event, he used the Change event of the P55_SELECT_LIST item. The first action was a Set Value containing the following query (again simplified for the example):

SELECT 1
FROM dual
WHERE  1 = :P55_SELECT_LIST

As "Page Items to Submit" he selected the item P55_SELECT_LIST, and as affected item he put P55_SELECT_LIST_2.

The last action he set was a Refresh action on our P55_SELECT_LIST_2 item.

Our page setup now looks like this:

(image: page setup)

The developer tried his application, and when he changed the first select list he saw the second select list being refreshed. Yet, instead of displaying the correct value (1), it displayed the default value.

(image: items)

So what went wrong?

Let’s see what exactly happened to figure that out:

  1. The page was loaded. The first select list had nothing selected, the second select list had "2" selected in the browser. Both items were empty in the session.
  2. The value of the first select list was changed by the end user.
  3. This triggered the change event defined in the dynamic action.
  4. The first action, "Set Value", was executed. The value in the browser was changed, and the session value of the item P55_SELECT_LIST was set to 1 because it was listed in the page items to submit; an AJAX request was made to set this value. The session value of P55_SELECT_LIST_2 remained NULL.
  5. The refresh action was executed; an AJAX request was made to get new values for the select list. Since the P55_SELECT_LIST_2 item has no value in the session, the default value is taken, which is 2!

The solution was simple: just remove the refresh action. Then the value is set correctly in the browser. This is handled by APEX with JavaScript/jQuery. The session value of the item will remain NULL until we submit the item to the database by a page submit, a dynamic action, or by changing the value in an AJAX Callback process.
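
To make the distinction concrete: anything that runs on the server, such as a validation, process or report query, only ever sees the session value. A minimal check, using the item from the example above, could look like this; until the item is submitted, both expressions return NULL no matter what the browser shows.

-- Server-side view of the item: the bind variable and the v() function
-- both read the SESSION value, not what is currently shown in the browser.
SELECT :P55_SELECT_LIST_2     AS bind_value,
       v('P55_SELECT_LIST_2') AS v_function_value
  FROM dual;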