Wednesday, November 06, 2013

APEX Listener supported App Servers

     With the latest news on GlassFish, I thought it would be a good time to review the deployment options for the APEX Listener.  The huge caveat is that this is as of today, 11/6/2013; the future can change anything, but there's nothing planned.

The Licenses

I'm just putting the important parts here for reference; they link to the entire license.

 The APEX Listener is only available via OTN.  When it is downloaded, this is the license everyone has to agree to:

“We grant you a nonexclusive, nontransferable limited license to use the programs solely for your internal business operations subject to the terms of this agreement and the license grants and restrictions set forth with your licensing of such Oracle database programs. We also grant you a nonexclusive, nontransferable limited license to use the programs for purposes of developing your applications. This agreement does not grant any rights or licenses to Oracle database programs.”

  • A restricted-use license for Oracle Application Server Containers for J2EE (OC4J) is included with all editions (except for Oracle Database Express Edition). This embedded version is provided solely to support Oracle Enterprise Manager (Database and Grid Control), Advanced Queuing Servlet, Ultra Search, Application Express, and Warehouse Builder, and may not be used or deployed for other purposes.

  • A restricted-use license for Oracle Application Express Listener is included with all editions (except for Oracle Database Express Edition) solely to support connectivity to Oracle Application Express, which may be installed in the Oracle Database. Running Oracle Application Express Listener on a different server does not require that that server be licensed to use the Oracle Database.

Supported Application Servers

The APEX Listener's documentation lists the supported web servers for deployment.

Supported Java EE Application Servers
Oracle Application Express Listener supports the following Java EE application servers:

Application Server        Supported Release
Oracle WebLogic Server    11g Release 1 (10.3.3) or later
Oracle GlassFish Server   Release 3 or later

Clarifications on the Application Servers

I think some of the confusion is over the wording of “Oracle GlassFish Server.”  This was not intended to limit support to the commercial version of GlassFish; we support the APEX Listener no matter which GlassFish you are using.  Support for the APEX Listener is tied directly to the database support agreement.  To state the obvious, the APEX Listener team is not supporting any application server, only our code base running inside that application server.  If an issue in the web server itself arises, the customer will have to track that issue down with the web server vendor.

This table of supported application servers will soon be updated to the following.

Application Server        Supported Release
Oracle WebLogic Server    11g Release 1 (10.3.3) or later
GlassFish Server          Release 3 or later
Apache Tomcat             Release 6 or later

Tuesday, October 01, 2013

How to use RESTful to avoid DB Links with ā'pěks

So the question came up of avoiding a DB link by using the APEX Listener's RESTful services to get at the same data.  This is all in the context of an APEX application, so apex_collections is the obvious place to stuff transient data that can be reused over a session.

Step 1:  Make the RESTful Service.
The only catch is to turn pagination off (set it to zero).  I didn't need it for now, so the code that follows doesn't account for it.

Step 2. Install PL/JSON 

  In the future this can be done with the built-in JSON support in the database.

Step 3.  Glue code.

declare
  g_collection_name varchar2(32767) := 'json_data';
-- JSON Portion
  l_url       varchar2(32767) := '';
  l_jsonObj   json;
  l_jsonObj2  json;
  l_jsonKeys  json_list;
  l_key       varchar2(32767);
  l_value     varchar2(32767);
  l_tempdata  json_value;
  l_temparray json_list;
  i  number;
  ii number;
  values_array dbms_sql.varchar2_table;

  procedure varray2collection(p_values dbms_sql.varchar2_table) as
    l_sql varchar2(32767) := 'begin APEX_COLLECTION.ADD_MEMBER (p_collection_name =>:NAME,';
    i   number;
    c   number;
    ret number;
  begin
    -- build up the begin..end sql block
    -- (lpad keeps the parameter names p_c001..p_c050 correct past 9 columns)
    for i in 1..p_values.count loop
      l_sql := l_sql || 'p_c'|| lpad(i,3,'0') ||'=>:c'||i||',';
    end loop;
    -- chop off the last comma
    l_sql := substr(l_sql,1,length(l_sql)-1);
    -- add the end
    l_sql := l_sql || '); end;';
    c := dbms_sql.open_cursor;
    dbms_sql.parse(c, l_sql, dbms_sql.native);
    -- bind in the collection name
    dbms_sql.bind_variable(c, 'NAME', g_collection_name);
    -- bind in the values
    for i in 1..p_values.count loop
      dbms_sql.bind_variable(c, 'c'||i, p_values(i));
    end loop;
    ret := dbms_sql.execute(c);
    dbms_sql.close_cursor(c);
  exception when others then
    if dbms_sql.is_open(c) then
      dbms_sql.close_cursor(c);
    end if;
    raise;
  end varray2collection;

  -- simple function to remove the leading and trailing "
  function trimQuotes(s varchar2) return varchar2 is
    l_value varchar2(32767);
  begin
    l_value := substr(s,2);
    l_value := substr(l_value,1,length(l_value)-1);
    return l_value;
  end trimQuotes;

  -- function to grab the URL and return a JSON object
  function getURL(p_url varchar2) return JSON is
    -- HTTP Portion
    l_http_request   UTL_HTTP.req;
    l_http_response  UTL_HTTP.resp;
    l_clob           CLOB;
    l_text           VARCHAR2(32767);
  begin
    -- Initialize the CLOB.
    DBMS_LOB.createtemporary(l_clob, FALSE);
    -- Make a HTTP request and get the response.
    l_http_request  := UTL_HTTP.begin_request(p_url);
    l_http_response := UTL_HTTP.get_response(l_http_request);
    -- Copy the response into the CLOB.
    begin
      loop
        UTL_HTTP.read_text(l_http_response, l_text, 32766);
        DBMS_LOB.writeappend(l_clob, LENGTH(l_text), l_text);
      end loop;
    exception when UTL_HTTP.end_of_body then
      -- release the resources associated with the response
      UTL_HTTP.end_response(l_http_response);
    end;
    dbms_output.put_line('Length:'|| DBMS_LOB.getlength(l_clob));
    return json(l_clob);
  end getURL;

begin
  -- apex collection
  apex_collection.create_or_truncate_collection(g_collection_name);

  l_jsonObj := getURL(l_url);
  -- grab the items
  l_temparray := json_list(l_jsonObj.get('items'));
  dbms_output.put_line('Count:'|| l_temparray.count);
  for i in 1..l_temparray.count loop
    -- get the Nth item
    l_tempdata := l_temparray.get(i);
    dbms_output.put_line('Record['|| i ||']:'|| l_tempdata.to_char);
    -- bring the Nth item into its own json object
    l_jsonObj2 := json(l_tempdata);
    -- get all the keys to walk
    l_jsonKeys := l_jsonObj2.get_keys();
    for ii in 1..l_jsonKeys.count loop
      l_key   := trimQuotes(l_jsonKeys.get(ii).to_char);
      l_value := trimQuotes(l_jsonObj2.get(l_key).to_char);
      dbms_output.put_line(l_key || '=' || l_value);
      -- build up the array to pass to the apex collection
      values_array(ii) := l_value;
    end loop;
    -- send it to the collection
    varray2collection(values_array);
  end loop;
end;
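For comparison, the same items walk takes only a few lines on the client side.  This is a hedged JavaScript sketch assuming the APEX Listener's usual {"items": [...]} envelope; itemsToRows and the sample payload are illustrative names, not from the original post.

```javascript
// Walk the {"items": [...]} envelope and flatten each record into an
// array of string values, much like the PL/SQL above fills values_array.
function itemsToRows(payload) {
  return payload.items.map(function (record) {
    return Object.keys(record).map(function (key) {
      return String(record[key]);
    });
  });
}

// hypothetical payload shaped like an APEX Listener response
var sample = { items: [ { empno: 7839, ename: "KING" } ] };
var rows = itemsToRows(sample);
// rows[0] is ["7839", "KING"]
```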

Step 4. Make a page in Application Express

  The glue code is in the process named Get JSON.
  The interactive report is simply: select * from apex_collections.

Step 5. It's alive

Oh, the ā'pěks shirt is from the oraclenerd store.  If you don't have one already, you should.

Friday, September 27, 2013

RESTful Cursor support for JSON

Just a real quick blog before I forget.  In the latest APEX Listener 2.0.4 patch, there's support for nested cursors.  There are two gotchas.  First, make sure to disable (set to 0) the pagination of the REST definition.  Second, this only works at the top level, so no nesting of nested cursors.

This is a very quick example of tables with, nested in each table, the columns and indexes that table has.

select t.table_name,
       cursor(select column_name
                from user_tab_cols tc
               where tc.table_name = t.table_name) cols,
       cursor(select index_name
                from user_indexes i
               where i.table_name = t.table_name) indexes
  from user_tables t;

The resulting JSON looks like this.
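The screenshot of the output didn't survive here, but the shape is roughly the following (illustrative table and column names; each nested cursor comes back as a child array):

```json
{
  "items": [
    {
      "table_name": "EMP",
      "cols":    [ { "column_name": "EMPNO" }, { "column_name": "ENAME" } ],
      "indexes": [ { "index_name": "PK_EMP" } ]
    }
  ]
}
```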

Tuesday, August 20, 2013

Chunked File loading with APEX Listener + HTML5

  I just found the HTML5 File API the other day, so I had to see what I could do with the APEX Listener's RESTful services.  There are a bunch of blogs on what can be done, such as on HTML5Rocks.

  The end result is that the new File API lets JavaScript get details of the file and slice it up into parts.  Then I made a pretty simple REST end point to receive the chunks and put them back together again.

The actual sending part of the JavaScript is here:
function sendChunk(chunkNumber){
        var start = chunkSize * (chunkNumber-1);
        // File.slice's end index is exclusive, so no -1 here
        var end = start + chunkSize;
      // create the slice of the file
        var fileContent = selectedFile.slice(start, end);
      // grab the length
        var length = fileContent.size;
      // POST the slice of the file to the REST end point
        $.ajax({
          url: uri,
          type: "POST",
          data: fileContent,
          processData: false,
          beforeSend: function(xhr) {
            // pass in the chunk size,offset,name
            // as headers
            xhr.setRequestHeader('x-chunknumber', chunkNumber);
            xhr.setRequestHeader('x-offset', start );
            xhr.setRequestHeader('x-chunksize', length );
            xhr.setRequestHeader('x-content-type', selectedFile.type );
          },
          success: function (data, status) {
            bytesUploaded += length;
            // set the percent complete
            var percentComplete = ((bytesUploaded / selectedFile.size) * 100).toFixed(2);
            $("#fileUploadProgress").text(percentComplete + " %");
            // make a link to the REST service that can deliver the file
            $("#downloadLink").html("New File");
            // if there are more chunks, send them over
            if ( chunkNumber < chunks ) {
              sendChunk(chunkNumber + 1);
            }
          },
          error: function(xhr, desc, err) {
            console.log(desc);
          }
        });
}
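The chunk arithmetic is easy to get off by one (File.slice treats its end index as exclusive), so it can be checked in isolation.  A sketch reusing the chunkSize/chunks names from the page; chunkBounds is an illustrative helper, not from the original:

```javascript
// Given a file size, a chunk size, and a 1-based chunk number,
// compute the [start, end) slice that sendChunk would post.
function chunkBounds(fileSize, chunkSize, chunkNumber) {
  var start = chunkSize * (chunkNumber - 1);
  // File.slice's end index is exclusive, so no -1 here
  var end = Math.min(start + chunkSize, fileSize);
  return { start: start, end: end, length: end - start };
}

// a ~1MB file in 32KB chunks needs Math.ceil(1000000 / 32768) = 31 sends
var last = chunkBounds(1000000, 32768, 31);
// last.length is 16960, the short final chunk
```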

The next step is to make the HTTP headers into bind variables so the PL/SQL block will be able to use them.

declare
  p_b         blob;
  p_body      blob;
  p_offset    number;
  p_filename  varchar2(4000);
  p_chunksize varchar2(200);
  p_status    varchar2(200);
begin
  -- pull the binds into locals
  p_offset    := :OFFSET + 1;
  p_body      := :body;
  p_filename  := :filename;
  p_chunksize := :chunksize;

  -- If there is a file already with this name, nuke it since this is chunk number one.
  if ( :chunkNumber = 1 ) then
    p_status := 'DELETING';
    delete from chunked_upload
     where filename = p_filename;
  end if;

  -- grab the blob storing the first chunks
  select blob_data
    into p_b
    from chunked_upload
   where filename = p_filename
     for update of blob_data;

  p_status := 'WRITING';

  -- append the new chunk
  dbms_lob.append(p_b, p_body);

exception
  -- if no blob was found above, do the first insert
  when no_data_found then
     p_status := 'INSERTING';
     insert into chunked_upload(filename, blob_data, offset, content_type)
       values ( p_filename, p_body, p_offset, :contenttype );
  when others then
      -- when something blows out, print the error message to the client
      htp.p(p_status || ': ' || sqlerrm);
end;

A very simple html page for testing it all out.

Here's a quick video of how it all works.

Here's the complete Javascript/html for this sample.

JS Bin

Friday, August 09, 2013

Data Modeler 4.0 EA1

  SQL Developer Data Modeler 4.0 EA1 is here.  There are lots of new things in there, including support for many of the 12c features such as identity columns, data masking, longer varchar2, and more.  There are still many other new features and tweaks for non-bleeding-edge installs.  See the full list here.

 As always, go download it from OTN and let us know on the forums what you like or find!

Thursday, August 08, 2013

Tailing work in the db as it runs with pipes

  It doesn't seem that long ago that I needed to monitor a process as it ran.  So I wrote a small wrapper over dbms_pipe to look like dbms_output, making it easy to switch back and forth.  That was in 2004, or so the file's timestamp says.  Time flies!

  DBMS_PIPE is nothing new, and many have blogged about its good and bad points.  There's lots more, but here's just a few:

  Here's the wrapper I made; the body is here:

create or replace package pipe_output as
   pv_pipe_on_bln BOOLEAN := false;

   PROCEDURE set_pipeoutput_on;

   PROCEDURE put_line (p_message_txt VARCHAR2);
   PROCEDURE put_line (p_pipe_name VARCHAR2, p_message_txt VARCHAR2);
   PROCEDURE get_line (p_waittime_num NUMBER := 1);
   PROCEDURE get_line (p_pipe_name VARCHAR2, p_waittime_num NUMBER := 1);
   PROCEDURE get_line (p_pipe_name IN VARCHAR2, p_waittime_num IN NUMBER := 1, p_message OUT VARCHAR2);
END pipe_output;

So what that would allow me to do in a PL/SQL block was something like this.

begin
  pipe_output.set_pipeoutput_on;
  for r in ( select object_name from user_objects ) loop
    pipe_output.put_line(r.object_name);
  end loop;
end;

Now that something is in the pipe, it's time to get it out.  For that I wrote a small Java program.  The same thing can be done directly from a sqlplus script or perl or whatever; it's not complicated.

import java.math.BigDecimal;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class TailDBMSPipe extends Thread {
    private String v_pipe_name;
    private String connName;

    public TailDBMSPipe(String pipeName) {
        v_pipe_name = pipeName;
    }

    public static void main(String[] args) {
        TailDBMSPipe p = new TailDBMSPipe(args[0]);
        p.connName = "jdbc:oracle:thin:" + args[1];
        //p.connName = "jdbc:oracle:thin:my_customers/";
        p.start();
    }

    public void run() {
        System.out.println("Redirecting database pipe:" + v_pipe_name);
        Connection conn = null;

        try {
            conn = DriverManager.getConnection(connName);

            String s = null;
            BigDecimal retVal;
            BigDecimal bigZero = new BigDecimal(0);

            while (true) {
                // wait up to 100 seconds for a message on the pipe
                CallableStatement cs = conn.prepareCall("{ ? = call DBMS_PIPE.RECEIVE_MESSAGE(?, ?) }");
                cs.registerOutParameter(1, Types.INTEGER);
                cs.setString(2, v_pipe_name);
                cs.setString(3, "100");
                cs.execute();
                retVal = cs.getBigDecimal(1);

                // 0 means a message was received
                if (retVal.compareTo(bigZero) == 0) {
                    cs = conn.prepareCall("{ call DBMS_PIPE.UNPACK_MESSAGE( ? ) }");
                    cs.registerOutParameter(1, Types.VARCHAR);
                    cs.execute();
                    s = cs.getString(1);

                    if (s != null) {
                        System.out.println(v_pipe_name + ":" + s);
                    }
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
            try {
                if (conn != null) { conn.close(); }
            } catch (Exception ex) {
                System.out.println("could not close");
            }
        }
    }
}
The result is that you can watch what comes across the pipe very quickly.  You may notice the timestamps in the output of the demo here:

Thursday, August 01, 2013

SQL Developer meets REST

  SQL Developer is bringing the ability to define all the APEX Listener's REST abilities.  There are many advantages to adding this in.  The first, and probably the most important to someone like me who typos and doesn't see it, is the ability to test the query or PL/SQL before publishing it, instead of getting a 500 error and not knowing why.
  This screenshot shows many other advantages.  The REST definition screens will have a fully functioning SQL Worksheet to test and tune the source of the REST call.  That means code insight, explain plan, autotrace, the SQL Tuning Advisor, grids of data, and anything else.

  The differences from a normal worksheet are that the name of the tab is the handler method plus the name of the REST end point, and there are additional tabs for defining in and out parameters.

  The other thing I frequently don't get correct is testing the URL.  I forget if I added a slash, or what the module was, or I typo something and get the URL wrong.  The new Details tab shows what the URL is going to be.

A quick video of making a new REST call in SQL Developer:

Wednesday, July 31, 2013

Using REST to search in JQuery

Building on yesterday's post, here's adding a search to the same jQuery list: SQL Developer Exchange Requests


The REST template differs only by adding in a bind variable and adjusting the select a little:

The other thing I added to the jQuery list is the weight of the searched items, which can be seen in the blue bar here:

The JavaScript changes are also fairly small.  I made a function named search which takes in the text to search on, then a few lines to bind the function to the listviewbeforefilter event.

// attach to the before filter in the list
$( "#results" ).on( "listviewbeforefilter", function ( e, data ) {
    // grab the value of the search field
    var $ul = $( this ),
        $input = $( data.input ),
        value = $input.val();
    // if blanked, go back to the recent list
    if ( value == '' ) {
        // reload the recent list here
    } else {
        // debugging
        console.log("Loading Search:" + value);
        search(value);
    }
});

function search(search){
    console.log("searching for '" + search + "'");
    // out with the old
    ko.cleanNode(document.getElementById("searchPage"));

    // in with the new
    $.ajax({
        url: "" + search,
        type: "GET",
        success: function (data) {
            if ( data.items.length == 0 ){ noResults(); }
            ko.applyBindings(data, document.getElementById("searchPage"));
        }
    });
}

Tuesday, July 30, 2013

Combining RESTful, jQuery, and KnockoutJS

Here's a very easy example of how to join RESTful services with jQuery and some data binding from KnockoutJS.  This is a page that shows the last 100 requested features in the SQL Developer Exchange.

The REST service is very simple; it just strips out a <p> that the editor injects and does a little formatting on the date.

I did this all on so it's nice and easy to share.  The KnockoutJS template is the key.  When JSON is returned from the APEX Listener, it has a parent node named "items".  The template is bound to the list with this line:

           data-bind="template: { name: 'overview', foreach: items}"

Then for each child under items it will loop and apply the template.  The template is just HTML inside a script tag with the type set to text/html:

<!-- Knockout syntax for the template which is the list items -->
<script id="overview" type="text/html">
<li data-role="list-divider"><!--ko text: title --><!--/ko--></li>
<!-- bind in the link to drill down to the exchange-->
<li> <a data-bind="attr: { href: ''+ id }" alt="">
<h2><!--ko text: title --><!--/ko--></h2>
<span data-bind="text: description"></span>
</a></li>
</script>
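Conceptually, the foreach binding is just a loop that stamps the template out once per child.  Here's a plain-JavaScript sketch of that idea, not Knockout's actual internals, and the {{ }} placeholder syntax is made up for the illustration:

```javascript
// Stamp a template string out once per item, substituting properties.
function renderList(items, template) {
  return items.map(function (item) {
    return template.replace(/\{\{(\w+)\}\}/g, function (m, key) {
      return item[key];
    });
  }).join("\n");
}

var html = renderList(
  [ { title: "Feature A" }, { title: "Feature B" } ],
  "<li>{{title}}</li>"
);
// html is "<li>Feature A</li>\n<li>Feature B</li>"
```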

Then the Javascript which loads the data and applies the template

var requests = "";

  // When the page init's go load it up.
$(document).on("pageinit", "#recentRequests", function() {
      url: requests ,
      type: "GET",
      success: function (data) {
        // go apply the template to what was retreived
        ko.applyBindings(data, document.getElementById("recentRequests"));

This barely scratches the surface of what can be done just getting data out of the database via the APEX Listener.  For example, the query in this REST service returns 100 rows, but pagination is set to 25.  I'll get that hooked up and expand this demo later.  As I tinker with more widgets, I'll write those up also.

Monday, July 29, 2013

Who's on first? Adding Who columns in the Modeler

   The SQL Developer Data Modeler has some great built-in transformations.  Let's take a look at the Table Template one.

  I'm going to start with the plain ol' HR schema, and we're going to add the standard who columns to it.

Step 1 is to create a table named "table_template".  There are plenty of options since it's just a normal table in the model.  The two interesting checkboxes are Engineer and Generate in DDL; these make it so the template table is not generated when you ask for the DDL or engineer to a Logical Model.

Step 2 is to add in the who columns we want to propagate to all tables in the model.

Step 3 is to do the transformation.  

Step 4 is to press the Apply button.

The final result is that all the tables in the model now have every column from the table_template.

The transformation itself is fairly straightforward, and it's all JavaScript:

 /* name of the table to template from */
var t_name = "table_template";
var p_name = "ctemplateID";

template = model.getTableSet().getByName(t_name);
if (template !== null) {
    tcolumns = template.getElements();
    tables = model.getTableSet().toArray();
    /* walk all tables in the model */
    for (var t = 0; t < tables.length; t++) {
        table = tables[t];
        // compare name ignoring the case; skip the template table itself
        if (!table.getName().equalsIgnoreCase(t_name)) {
            for (var i = 0; i < tcolumns.length; i++) {
                column = tcolumns[i];
                // reuse the column if it was already copied in
                col = table.getColumnByProperty(p_name, column.getObjectID());
                if (col === null) {
                    col = table.createColumn();
                }
                col.copy(column);
                //set property after copy otherwise it'll be cleared
                col.setProperty(p_name, column.getObjectID());
            }
        }
    }
}

Friday, July 26, 2013

No DB access? Use REST

The problem.

   There's an arbitrarily sized image in a database which needed to be made square, sized into 4 sizes, and given a border color-matched to the picture (like iTunes albums).  Also, do it in batches so as not to cause unneeded load on the database.  Then there's the minor detail that there's no direct SQL*Net access to the db.

  Here's the current result.  The original picture was not quite square at 490x604.

The result is much more uniform

The list of work.

  Subsetting the list to be done was the simplest part.  A RESTful service that ties who has images to what to do produced this list.  This was my test on myself, but it can obviously be opened up to do any subset, or lower the pagination and process one page at a time.

The original image.

  There's not much to say here since I covered how to get an image out over REST here.

The processing.

  I read a lot about what people were doing to color match in various ways.  The actual code is in Java, and I can post it if anyone is interested.  However, here's what I basically did.

      Step 1,  Process EVERY pixel and find the most common Hue (the H of HSV).
      Step 2,  Rewalk the pixels, find every one of that Hue, and get the most common Saturation and Brightness.
      Step 3,  Resize with a Java library named Scalr.
      Step 4,  Center the resized image and pad to make it square, using the computed HSV as the padding color.
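Steps 1 and 2 are just histogram-and-mode passes.  Here's a hedged sketch of that core in JavaScript rather than the original Java; the dominant function and the inline hue list are illustrative, and real code would pull the values from the image's pixels:

```javascript
// Return the most common value in a list (e.g. pixel hues 0-359).
// Step 1 applies this to every pixel's hue; step 2 applies the same
// idea to the saturation/brightness of the pixels matching that hue.
function dominant(values) {
  var counts = {};
  var best = null;
  values.forEach(function (v) {
    counts[v] = (counts[v] || 0) + 1;
    if (best === null || counts[v] > counts[best]) { best = v; }
  });
  return best;
}

var hue = dominant([0, 120, 120, 240]);
// hue is 120
```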

  Sounds simple, but it took me a while to figure out the way to do this.  For this test image, it took ~300ms to process and make all the resized versions.  There are probably better/faster ways to do this, but this seems to be working for me.
    For the image of me above, the dominant color happened to be black, which is why there are black bars.  However, to prove it works, here's another example.  The color match is on the left and right, and I put red lines around the seam on the right.

Putting it back into the Database.

  Now back to using REST.  There are 3 images that have been processed and need to be updated in the database.  Most of what I've shown has been GET requests against the APEX Listener; this will require a POST URI handler.  The URI template shows I'll post in the person's email address and the size of the image, so I know which column to place the updated image into.

The part not in the template anywhere is where the image itself goes.  The APEX Listener makes an automatic bind named :body, which is the body of the file upload.  The other thing the listener does is print out anything in the htp.p buffer, which is why the debugging in this POST handler works.  I'm using htp.p in this block simply to show progress and see what's going where.

declare
  p_id number;
  img  BLOB;
begin
  /* bring the LOB to a local variable */
  img := :body;

  /* get the id */
  select person_id into p_id
    from people p
   where email = :email;

  /* print the size for debugging */
  htp.p('Lob size:'|| dbms_lob.getlength(img));

  /* based on which size stuff the image into different columns */
  if ( :size = 80 ) then
     htp.p('Updating 80');
     update photos
        set scaled_photo = img
      where person_id = p_id;
  elsif ( :size = 180 ) then
     htp.p('Updating 180');
     update photos
        set thumb_photo = img
      where person_id = p_id;
  elsif ( :size = 436 ) then
     htp.p('Updating 436');
     update photos
        set large_thumb_photo = img
      where person_id = p_id;
  end if;

exception when others then
  /* send any error back to the client for debugging */
  htp.p(sqlerrm);
end;

Putting it all together.
   The actual Java code that runs it all is not that interesting.  I'm using Jackson to process the JSON listing the people's images to process.  There are a few libraries out there for JSON processing; this one just seemed easy for me to pick up.
  The posting of the images back uses Apache's HttpClient.  Here's that code.

 HttpClient httpclient = new DefaultHttpClient();
    try {
      // Create the POST client
      HttpPost httppost = new HttpPost("" + email + "&size=" + size);
      // The image scaling is done locally to a file then posted
      InputStreamEntity reqEntity = new InputStreamEntity(new FileInputStream(file), -1);
      // set the POST to be binary content
      reqEntity.setContentType("binary/octet-stream");
      httppost.setEntity(reqEntity);

      System.out.println("executing request " + httppost.getRequestLine());
      HttpResponse response = httpclient.execute(httppost);
      HttpEntity resEntity = response.getEntity();

      String line = null;
      // now print the text from the htp.p debug messages.
      BufferedReader in = new BufferedReader(new InputStreamReader(resEntity.getContent()));
      while ((line = in.readLine()) != null) {
        System.out.println(line);
      }
    } finally {
      // shut it down
      httpclient.getConnectionManager().shutdown();
    }
Thursday, July 25, 2013

Nicer URLs for APEX, yet another option, Part 2

   Here's taking the nicer URLs one more step, and I think I have 2 more that will follow next week.  I have been doing some testing with the Online Learning Library.

In case the screenshot is too small, the URL is

The URI template is easy to follow and fairly portable.  I'm working on a generic way to do this in any app, but here's the one we're using for this specific page.


Then the APEX Listener translates the things in { } into binds, which can be used in this PL/SQL block to run the f procedure and get a page out of APEX.

declare
 /* buffer to build up the value */
 p varchar2(32767);
begin
  /* add in app, page, and session (these bind names come from the URI template) */
  p := :app ||':'|| :page ||':'|| :app_session;

  /* move along nothing to see here */
  p := p||'::::';

  /* add in item names */
  p := p|| :item1;
  if ( :item2 is not null ) then
    p := p|| ','||:item2;
  end if;

  /* add in item values (the colon separates names from values) */
  p := p||':'|| :item1v;

  if ( :item2 is not null ) then
    p := p|| ','||:item2v;
  end if;

  /* hand the assembled p= string to the APEX f procedure */
  f(p);
end;
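The string being built follows APEX's positional f?p format, App:Page:Session:Request:Debug:ClearCache:itemNames:itemValues.  A quick JavaScript rendering of the same concatenation (buildP and the argument values are illustrative) makes the colon positions easier to count:

```javascript
// Assemble an f?p= argument from its positional pieces; request,
// debug, and clearcache are left empty, which is where "::::" comes from.
function buildP(app, page, session, itemNames, itemValues) {
  return app + ":" + page + ":" + session + "::::" +
         itemNames.join(",") + ":" + itemValues.join(",");
}

var p = buildP(101, 3, 0, ["P3_ID"], ["42"]);
// p is "101:3:0::::P3_ID:42"
```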


   I have to mention that things could be rendered badly if anything reads the owa_util package to get assumptions from the URL.  One thing that I know didn't work for me was the use of apex_util.get_blob_file, but that's easy to fix with yesterday's post.

Wednesday, July 24, 2013

Using RESTful to deliver Images for APEX

   Who doesn't want things to be faster?  Here's a very easy way to get images in the database out faster.  The normal way to get images embedded into an Application Express app is to use apex_util.get_blob.  It's tried and tested and optimized for doing just that, so you may assume it doesn't get any faster.  Here's a quick screenshot of loading a test page in Chrome with 2 identical images, and by identical, I mean the same BLOB from the same table with the same ID.

The shorter timeline is the RESTful delivery.  Mileage will of course vary by load and latency, but in what I've tested, REST delivery is always faster.

For 5 reloads these were the timings I got:

apex_util (ms)   REST (ms)   % Faster
259              137         89.05%
362              311         16.40%
250              129         93.80%
245              130         88.46%
269              126         113.49%
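The % Faster column is just the ratio of the two times minus one; a quick check of the arithmetic (percentFaster is an illustrative helper):

```javascript
// How much faster REST delivery was relative to apex_util,
// as a percentage, e.g. 259ms vs 137ms is 89.05% faster.
function percentFaster(apexMs, restMs) {
  return ((apexMs / restMs - 1) * 100).toFixed(2);
}

var row1 = percentFaster(259, 137);
// row1 is "89.05", matching the first row of the table
```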

Now to introduce the RESTful handler type of Media Resource, which makes this very easy to do.  Colm initially blogged about it a while back.  In a nutshell, it's a query that wants 2 columns: the first being the content type (mime) and the second being a LOB, or anything, to be delivered.  Colm showed how to deliver XML in his post; this is how to serve up images.

In the URI template, you'll notice {id}, which is the ID of the image to be served and is used as a bind in the query.  The select is quite simple to follow.

The table for this example is quite simple, but it could obviously be anything as long as there's a mime type and a LOB to deliver.

CREATE TABLE myimages (
  id           NUMBER,
  content_type VARCHAR2(255),
  img          BLOB
);