There is this recurring problem we have in GIS: we get road segments, but want to show complete roads. The naive approach would be to do something like the following:

insert into street_geoms
select ro.rd_ro_ident, ro.rd_ro_name, ro.com_code, sdo_aggr_union(sdoaggrtype(ros.rd_ros_geometry, 0.005)) as geom
from rd_road ro, rd_ro_sec ros
where ros.rd_ro_ident = ro.rd_ro_ident
group by ro.rd_ro_ident, ro.rd_ro_name, ro.com_code;

For good measure: we have 45.000+ roads, with a total of 230.000+ road segments. So when that query started running and kept on running, I started googling. Apparently there are two faster alternatives: SDO_AGGR_CONCAT_LINES and SDO_AGGR_SET_UNION. The first was really quick (it completed in minutes), but the result was completely wrong: entire segments were missing. The second might be quicker, but it was really hard to get an idea of its progress, and if it failed, everything would be lost (rolled back).

So I decided to write a little script that issues a separate SQL statement for each single road, allowing me to track progress and adding restartability. For each road I issued a statement like:

insert into street_geoms
select ro.rd_ro_ident, ro.rd_ro_name, ro.com_code, sdo_aggr_set_union(CAST(COLLECT(ros.rd_ros_geometry) AS mdsys.SDO_Geometry_Array),0.005) as geom
from rd_road ro, rd_ro_sec ros
where ros.rd_ro_ident = ro.rd_ro_ident
  and ro.rd_ro_ident = 1895101 
group by ro.rd_ro_ident, ro.rd_ro_name, ro.com_code;

I added some ruby code around it to track the progress and calculate the remaining time, just to have an idea. The first “large” road it stumbled upon literally took hours, and it only had to join 39 segments. A simple query showed me I had 150+ roads with even more segments, up to a maximum of 125 segments per road. I could not just simply ignore them :) So this was not going to work either.

Why would this be so hard? I just wanted to throw all the linestrings together into one geometry. How could I do that? Querying the geometries was really easy, so what if I joined the geometries outside of oracle? Would that be hard? Not at all, because there is a simple solution: convert the geometries to WKT, and join all the LINESTRINGs into one MULTILINESTRING. That is just simple string manipulation. I can do that ;)
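To make that idea concrete, here is a minimal sketch of the string manipulation in plain ruby (the helper name and the sample coordinates are mine, not from the actual script):

```ruby
# Join several LINESTRING WKT strings into one MULTILINESTRING:
# each "LINESTRING (x y, x y, ...)" keeps only its coordinate list,
# and the lists are glued together inside MULTILINESTRING(...).
def join_linestrings(wkt_segments)
  bodies = wkt_segments.map { |wkt| wkt.sub(/\ALINESTRING\s*/, "") }
  "MULTILINESTRING(#{bodies.join(", ")})"
end

join_linestrings(["LINESTRING (0 0, 1 1)", "LINESTRING (1 1, 2 2)"])
# => "MULTILINESTRING((0 0, 1 1), (1 1, 2 2))"
```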

I had some hiccups with this approach: handling the long strings proved a bit awkward (use a CLOB instead), and I had to regularly call GC.start to make sure the open cursors were released. I also had to make sure not to build a string literal that was too long (ORA-06550).
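The trick for the too-long literals: instead of emitting one giant PL/SQL string literal, let the join separator close one literal and open the next concatenation. A stripped-down illustration (the coordinate data is made up):

```ruby
# Joining the pieces with a separator that *closes* one PL/SQL string
# literal and *opens* the next turns one huge literal into many small ones.
bodies = ["(0 0, 1 1)", "(1 1, 2 2)", "(2 2, 3 3)"]
plsql = "wkt_str := 'MULTILINESTRING(" +
        bodies.join(", ';\nwkt_str := wkt_str || '") +
        ")';"
puts plsql
# wkt_str := 'MULTILINESTRING((0 0, 1 1), ';
# wkt_str := wkt_str || '(1 1, 2 2), ';
# wkt_str := wkt_str || '(2 2, 3 3))';
```

Each generated line stays well under the PL/SQL literal length limit, however many segments a road has.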

But in the end I was able to join the road sections for the 45.000+ roads in approx 1.5h, which is not blindingly fast, but still faster than one single SDO_AGGR_SET_UNION operation :) :)

For reference you can see the full code:

class StreetGeom < ActiveRecord::Base
  self.primary_key = 'rd_ro_ident'
end

def format_time(t)
  t = t.to_i
  sec = t % 60
  min  = (t / 60) % 60
  hour = t / 3600
  sprintf("% 3d:%02d:%02d", hour, min, sec)
end

def eta(count)
  if count == 0
    "ETA:  --:--:--"
  else
    elapsed = Time.now - @start_time
    # eta = elapsed * @total / count - elapsed;
    eta = (elapsed / count) * (@total - count)

    sprintf("ETA: %s", format_time(eta))
  end
end

all_roads = Road.count
geoms_to_calculate = all_roads - StreetGeom.count
@total = geoms_to_calculate

puts "Joining geometries for #{all_roads} roads [still #{geoms_to_calculate} to do]"

cntr = 1
@start_time = Time.now

done = 0

Road.order(:rd_ro_ident).each do |road|
  street_count = StreetGeom.where(rd_ro_ident: road.rd_ro_ident).count
  print "\rConverting #{cntr}/#{all_roads} [ #{eta(done)} ] "
  if street_count == 0
    print "..."

    ## get all geometries in WKT format
    get_geoms_sql = <<-SQL
      select sdo_cs.make_2d(ros.rd_ros_geometry).get_wkt() as wkt_geom
      from rd_ro_sec ros
      where ros.rd_ro_ident = #{road.rd_ro_ident}
    SQL

    cursor = Road.connection.execute(get_geoms_sql)

    line_strings = []

    while row = cursor.fetch
      line_string = row[0].read.to_s
      # drop the leading "LINESTRING", we only need the coordinate list
      line_strings << line_string[10..-1]
    end

    ## build one MULTILINESTRING in a PL/SQL block; the join separator
    ## closes and re-opens the string literal, so no single literal
    ## becomes too long (ORA-06550)
    insert_sql = <<-SQL
      declare
        wkt_str clob;
      begin
        wkt_str := 'MULTILINESTRING(#{line_strings.join(", ';\nwkt_str := wkt_str || '")})';
        insert into street_geoms(rd_ro_ident, name, com_code, geom)
        values (#{road.rd_ro_ident}, q'[#{road.rd_ro_name}]', '#{road.com_code}',
             sdo_util.from_wktgeometry(to_clob(wkt_str)) );
      end;
    SQL
    Road.connection.execute(insert_sql)

    done += 1
    print "_"
  end

  cntr += 1

  # periodically run the GC so open cursors get released ...
  # to avoid ORA-1000 errors
  if (cntr % 50) == 0
    GC.start
  end
end

print "\n"
puts "\n\nDone!"

and I run this script in the rails environment as follows: rails runner lib/tasks/join_road_geometries.rb.
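The ETA logic in the script is nothing more than linear extrapolation: the average time per finished road, times the number of roads left. In isolation (with made-up numbers):

```ruby
# ETA = (elapsed time / items done) * (items remaining)
def eta_seconds(elapsed, done, total)
  (elapsed.to_f / done) * (total - done)
end

eta_seconds(600, 100, 400)
# 600 s for 100 roads => 6 s each, 300 roads left => 1800.0 s
```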

We are in the process of migrating an old GIS system. For our new systems we use PostGIS, but this one still uses Oracle. The data spans two countries: Belgium and the Netherlands. Our system does something awful: all data is stored in RD (the Dutch coordinate system, using Oracle SRID 90112).

So how do we get data into the system? Belgian data is entered as Lambert 72 (Oracle SRID 327680) and then transformed to 90112.

Our client uses a customised viewer that shows the data either in RD or Lambert 72. Now we want to switch to a more generic solution, and show the data in WGS84. We are using Oracle 11, so my initial plan was the following:

  • extract the Belgian data from the tables, and convert it back to 327680 (SDO_CS.transform(geom, 327680))
  • set the SRID to 31370 (which is the correct/best SRID for Belgium – it has the correct transformation to WGS84) as follows: update be_geoms bg set bg.geom.sdo_srid = 31370 (so without transformation)
  • for the Dutch data I just set the SRID to 28992
  • and then transform both to WGS84!

Easy! Done! Ready! However … I was not … The data was not positioned correctly. So I checked the definitions in MDSYS.CS_SRS for both 28992 and 31370, compared them to the official EPSG definitions, and lo and behold: both were wrong. So now I had to update them.

Updating EPSG:31370

delete from mdsys.cs_srs where srid=31370;
Insert into MDSYS.CS_SRS (CS_NAME,SRID,AUTH_SRID,AUTH_NAME,WKTEXT,CS_BOUNDS,WKTEXT3D) values ('Belge 1972 / Belgian Lambert 72',31370,31370,'IGN Brussels','PROJCS["Belge 1972 / Belgian Lambert 72", GEOGCS [ "Belge 1972", DATUM ["Reseau National Belge 1972 (EPSG ID 6313)", SPHEROID ["International 1924 (EPSG ID 7022)", 6378388.0, 297.0], -106.869,52.2978,-103.724,0.3366,-0.457,1.8422,-1.2747], PRIMEM [ "Greenwich", 0.000000 ], UNIT ["Decimal Degree", 0.0174532925199433]], PROJECTION ["Lambert Conformal Conic"], PARAMETER ["Latitude_Of_Origin", 90.0], PARAMETER ["Central_Meridian", 4.3674866666666667], PARAMETER ["Standard_Parallel_1", 51.1666672333333333], PARAMETER ["Standard_Parallel_2", 49.8333339], PARAMETER ["False_Easting", 150000.013], PARAMETER ["False_Northing", 5400088.438], UNIT ["Meter", 1.0]]',null,'PROJCS[
  "Belge 1972 / Belgian Lambert 72",
  GEOGCS["Belge 1972",
    DATUM["Reseau National Belge 1972",
      SPHEROID["International 1924", 6378388.0, 297.0,
        AUTHORITY["EPSG", "7022"]],
      AUTHORITY["EPSG", "6313"]],
    PRIMEM["Greenwich", 0.000000, AUTHORITY["EPSG","8901"]],
    UNIT["degree (supplier to define representation)", 0.0174532925199433, AUTHORITY["EPSG", "9122"]],
    AXIS["Lat", NORTH],
    AXIS["Long", EAST],
    AUTHORITY["EPSG", "4313"]],
  PROJECTION ["Lambert Conformal Conic"],
  PARAMETER ["Latitude_Of_Origin", 90.0],
  PARAMETER ["Central_Meridian", 4.3674866666666667],
  PARAMETER ["Standard_Parallel_1", 51.1666672333333333],
  PARAMETER ["Standard_Parallel_2", 49.8333339],
  PARAMETER ["False_Easting", 150000.013],
  PARAMETER ["False_Northing", 5400088.438],
  UNIT["metre", 1.0, AUTHORITY["EPSG", "9001"]],
  AXIS["X", EAST],
  AUTHORITY["EPSG", "31370"]]');

… and this worked: my transformation for Lambert 72 is now correct!

Updating EPSG:28992

… proved to be a little trickier. I assumed I could just reuse the same method as for the Belgian coordinate system (yes, I know, assume = ass-u-me).

I was unable to simply delete or update 28992, because I got an error that a child record exists: ORA-02292 with reason COORD_OPERATION_FOREIGN_SOURCE. Googling this revealed nothing at all.

So I had to dig deeper. And deeper. MDSYS.CS_SRS is actually a view, which tries to update the underlying tables, and the TOWGS84 coordinates, which I had to change/update, are stored in MDSYS.SDO_DATUMS. So after some searching, it proved to be quite easy. To update EPSG:28992, I just had to do:

update mdsys.sdo_datums set
  shift_x = 565.417,
  shift_y = 50.3319,
  shift_z = 465.552,
  rotate_x = -0.398957,
  rotate_y = 0.343988,
  rotate_z = -1.8774,
  scale_adjust = 4.0725
where datum_id = 6289;


My initial (naive) assumption was that the SDO_CS.UPDATE_... functions would actually retrieve the latest EPSG definitions; unfortunately, no such luck :) :)

Stuff like this makes me appreciate PostGIS even more.

In a project we built we are using que for our background jobs, and there is a very simple (but sufficient) and clean web-UI called que-web, allowing us to monitor the status of the jobs online.

Normally, you just include it in your project by adding the gem, and then adding the following to your config/routes.rb:

require "que/web"
mount Que::Web => "/que"

But, this is completely open and unauthenticated. So we use devise, and it is really easy to limit a route to authenticated users:

require "que/web"
authenticate :user do
  mount Que::Web => "/que"
end

At least this limits the accessibility to logged-in users. But we wanted it to be available only to admin users.
So I thought I had to resort to defining my own constraint class, as follows:

class CanSeeQueConstraint
  def matches?(request)
    # determine if the current user is allowed to see que
  end
end

and in the routes write it as follows

require 'can_see_que_constraint'
mount Que::Web, at: '/que', constraints: CanSeeQueConstraint.new

The problem was: how do I get to the current user in a constraint class? So I took a peek at how the authenticate block in devise works, and apparently there is an easier option: the authenticate block takes a lambda, in which you can test the currently authenticated user. Woah! Just what we need. So we wrote the following to only allow our administrators to see/manage our background jobs:

authenticate :user, lambda {|u| u.roles.include?("admin") } do
  mount Que::Web, at: 'que'
end
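The lambda is just a predicate over the user that devise has signed in; anything truthy lets the request through. A tiny stand-alone sketch (the Struct user is of course a stand-in for the real model):

```ruby
# Stand-in for the real devise user; only #roles matters here.
User = Struct.new(:roles)

# The same predicate we pass to the authenticate block.
admin_only = lambda { |u| u.roles.include?("admin") }

admin_only.call(User.new(["admin", "user"]))  # => true
admin_only.call(User.new(["user"]))           # => false
```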

Normally in rails, you can only render views inside of a controller. But what if you want to render a view somewhere else? For instance, we wanted to generate XML files using views: Haml can describe XML just as well as plain HTML.

There is a gem called render_anywhere that allows just that. For example:

class Organisation < ActiveRecord::Base

  has_many :members

  include RenderAnywhere

  def to_xml
    render partial: "#{self.to_partial_path}", object: self, layout: 'my_xml_layout'
  end
end

We had a little problem when using partials though.

Normally if you type something like

= render @member

it will ask the partial path from the model (@member.to_partial_path), but somehow this always got prefixed with render_anywhere. The gem creates a dummy RenderingController in the RenderAnywhere namespace, so apparently it looks the view up under the render_anywhere prefix.
In our case, I did not want to use the render_anywhere subfolder. It took me a while to figure out how to overrule this, but in essence it is pretty simple: rails uses the namespace of the rendering controller to prefix the path. Some deep googling revealed that any controller has a method called _prefixes, which lists all the prefixes for that class.

We can easily verify this in the rails console:

:001 > RenderAnywhere::RenderingController._prefixes
=> ["render_anywhere/rendering"]

So if we could overrule _prefixes to just return ["rendering"] … Mmmmmm fork the code of render_anywhere? Or …

There is another option: render_anywhere allows you to supply your own RenderingController and will use that instead if found in the context where the RenderAnywhere code is included.

So, if you write something like:

class Organisation < ActiveRecord::Base

  has_many :members

  include RenderAnywhere

  class RenderingController < RenderAnywhere::RenderingController

    def self._prefixes
      ["rendering"]
    end
  end

  def to_xml
    render partial: "#{self.to_partial_path}", object: self, layout: 'my_xml_layout'
  end
end
it will look for a view called members/member. Woot. To specify a different sub-folder you can adapt the _prefixes method as you wish :)

Developing rails websites with a geographic component, we rely heavily on PostGIS. We use activerecord-postgis-adapter for the PostGIS support, and I always use schema_plus because it allows me to define views. Until recently, I always had to use structure.sql instead of schema.rb, because the geometric columns did not dump correctly.

But for a while now activerecord-postgis-adapter has handled this correctly, so we use the schema.rb file again. Only to discover a “new” error:

ActiveRecord::StatementInvalid: PG::DependentObjectsStillExist: ERROR:  cannot drop view geography_columns because extension postgis requires it
HINT:  You can drop extension postgis instead.
: DROP VIEW IF EXISTS "geography_columns"

Apparently specific PostGIS views are also dumped in the schema file, and those views obviously cannot simply be re-created.

A very naive solution I kept using was to comment out those create_view lines in our schema.rb file. But apparently there is a much better solution: you can configure which tables and views the schema dumper should ignore.

So I added an initializer in config/initializers/schema_dumper.rb with the following content:

ActiveRecord::SchemaDumper.ignore_tables = [
   "geography_columns", "geometry_columns", "spatial_ref_sys", "raster_columns", "raster_overviews"
]
And now my schema.rb is correct, and simple commands like rake db:setup or rake db:test:prepare just work. Hehe.

I have a very weird problem with my geoserver+oracle combination when deployed on a Windows 2012R2 server (see here), and in an attempt to solve it, I upgraded the geoserver from 2.6.3 to 2.7.1, hoping that would fix it.

Sometimes fairy tales come true, but in this case the upgrade did not help, unfortunately. Version 2.7.1 did render a lot quicker, except for one layer, which did not render at all anymore.

My style could not be rendered, giving the error The requested Style can not be used with this layer. The style specifies an attribute of <missing attribute name>. Checking the layer in geoserver, I could see it was no longer able to determine any of the attributes for the given table.

Further investigation in the logfile revealed the following (cryptic) error:

Failure occurred while looking up the primary key with finder: org.geotools.jdbc.HeuristicPrimaryKeyFinder@24cf7139

java.sql.SQLException: Exhausted Resultset

Mmmmmm. Luckily my google-fu revealed a linked issue, with a simple solution:

updating the driver from ojdbc14.jar to the newer ojdbc7.jar fixes this problem.

Hehe :)

Updating geoserver did not fix my problem: my layer still had some duplicate column names. This might not seem such a big problem: everything is drawn correctly and WMS calls work, but WFS calls gave the irritating yet predictable error ORA-00918: column ambiguously defined. Annoying.

So how does one find the column names for a table in oracle? With a query like:

select * from dba_tab_columns where table_name = 'YOUR_TABLE_NAME';

and all of a sudden I saw the same set of column names, with some duplicates. Apparently my oracle database contains the table twice, in two different schemas. Since my user had the permissions to access the other schema, it seems geoserver does not limit the query to the (specified) schema at all.

The fix then was easy: make the other schema inaccessible. In my case the second schema was for testing purposes, so I could just delete it.

At my current job we make GIS websites, using rails and geoserver. I develop on mac, and for some clients we need to deploy on windows. One client is still using an Oracle database, while in general I prefer to work with PostGIS databases; geoserver also offers better support for PostGIS.

So: when working locally I ran into a really weird phenomenon in my geoserver: it duplicated various oracle columns. Generally not a problem for viewing, but when using WFS I got the “column ambiguously defined” error, using Oracle SQL Views did not work (it went looking for meta data?), and the Geoserver SQL Views were painfully slow.

But on my client's server I installed Geoserver 2.6.3 and the oracle stuff just worked. Woot :) So I had to upgrade my ancient 2.3.3 geoserver, which runs inside a tomcat. Upgrading seemed easy enough: copy the old geoserver folder somewhere (actually you would only need the data folder and the web.xml, but I am lazy/extra safe like that), drop in the new war, and theoretically we should be good to go.

Except … I got this peculiar error in my log-file

SEVERE: Error listenerStart

WTF! Thanks to some googling, I added a logging.properties file to my geoserver\WEB-INF\classes with the following content:

org.apache.catalina.core.ContainerBase.[Catalina].level = DEBUG
org.apache.catalina.core.ContainerBase.[Catalina].handlers = java.util.logging.ConsoleHandler

restarted my tomcat, and the following appeared:

SEVERE: Error configuring application listener of class org.geoserver.platform.GeoServerHttpSessionListenerProxy
java.lang.UnsupportedClassVersionError: org/geoserver/platform/GeoServerHttpSessionListenerProxy : Unsupported major.minor version 51.0 (unable to load class org.geoserver.platform.GeoServerHttpSessionListenerProxy)

Now what kind of cryptic error is that? Apparently it is a very compact way of saying: this code needs java 1.7, and you are still running java 1.6 (I am looking at you, Apple). Updating now :)

The simplest way to select all distinct values of a column is, somewhat unintuitively:

Visit.uniq.pluck(:project)

this runs the query select distinct project from visits, and returns an array of strings. Exactly what you need, except … I want it to be paginated. So we append a .page(params[:page])

… and that completely bombs: we now get an array of numbers?

So we try something else, and write:

Visit.select('distinct project')

which runs the right query, but returns an array of Visits with only the project attribute filled in. I can live with that. And then pagination (using the kaminari gem) works as expected again:

Visit.select('distinct project').page(params[:page])

Nice :)


I have a single bootstrap modal which is called from different places, so the modal contains some data-* attributes I want to set before showing it. Just using the .data() method offered by jquery does not work.

Detailed example

Suppose you have mark-up like this:

<div data-some-important-value="123"></div>

Asking the value is quite easy:

$('[data-some-important-value]').data('some-important-value')
And, according to the documentation, setting the data on a DOM element should be as easy as:

$('[data-some-important-value]').data('some-important-value', 'new-value')

If you try this in the console, you can verify that it does not work. This is where it gets confusing (to me). Apparently the .data() method offered by jquery existed before the HTML5 data-* attributes did, and jquery nicely integrated them. But the data-* attributes are only read once, and never written back to the document.

To still be able to do this, use the .attr() method instead:

$('[data-some-important-value]').attr('data-some-important-value', 'new-value')

Now I only have to include one modal “template” in my HTML, and set the data-* attributes to customize the behaviour.