When you encounter SSL errors while installing gems on Windows, the easiest workaround is to change your sources from https://... to http://.... But … I am an avid user/fan of rails-assets.org, and today I suddenly started getting SSL errors on their domain.

So at first I feared that rails-assets had shut down as foreseen (in this ticket), but the site was still reachable, and it turns out they switched (apparently just two days ago) to a new maintainer, which is awesome: the future of rails-assets is safe for now.

But there is no rose without a thorn: rails-assets now enforces TLS (which is actually a good thing), so the connection is always over SSL and gem cannot sidestep it by switching to http anymore. Doh! So I was stuck on Windows.

I tried to make the gem command ignore SSL errors regardless, by creating c:\ProgramData\gemrc with the following content:

---
:ssl_verify_mode: 0 
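# 0 corresponds to OpenSSL::SSL::VERIFY_NONE, so certificate verification is skipped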

and that partly worked: I was now able to fetch the index, but I then got the SSL error on the first gem retrieved from rails-assets, so I was still not in the clear. I had to make sure SSL verification actually worked!

Fortunately, after some googling, this proved easier than expected! The root cause is that Ruby on Windows (or OpenSSL) ships without a default root certificate bundle. So I found a good description of how to fix that on Windows.

I used the boring/easy/manual approach, in short:

  • download the cacert.pem file from http://curl.haxx.se/ca/cacert.pem. I saved this to my ruby folder (e.g. c:\ruby21).
  • add an environment variable SSL_CERT_FILE, so ruby can pick it up. E.g. in your command prompt type set SSL_CERT_FILE=C:\ruby21\cacert.pem. To make this a permanent setting, add this to your environment variables.
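To check that verification really works afterwards, a quick test from irb can help. This is only a minimal sketch (rubygems.org is just an arbitrary HTTPS endpoint here); it raises OpenSSL::SSL::SSLError if the certificate bundle is still not picked up:

require 'net/http'
require 'openssl'

# open an HTTPS connection with peer verification explicitly enabled;
# this fails with OpenSSL::SSL::SSLError when no usable root certificates are found
uri = URI('https://rubygems.org')
Net::HTTP.start(uri.host, uri.port, use_ssl: true,
                verify_mode: OpenSSL::SSL::VERIFY_PEER) do |http|
  puts http.head('/').code
end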

In a system I am helping to develop, a person can be linked to a myriad of things, including other people, so we use a relation table PersonRelation, defined as follows:

 class PersonRelation < ActiveRecord::Base
   belongs_to :person
   belongs_to :personifiable, :polymorphic => true
   belongs_to :person_relation_type
 end

So a person can be linked to different “personifiable” things, and the meaning of the relation can differ (e.g. a person could be an owner or a renter, expressed by the relation type).

In our data model a person is actually a “legal person”, so it could also be an organisation, and an organisation can have contacts. Logically, a contact belongs to one or more organisations by following the association in the reverse direction.

When using Rails 3 up until 4.0, we wrote the associations as follows:

  has_and_belongs_to_many :contacts,
                          :join_table => "person_relations",
                          :class_name => "Person",
                          :foreign_key => "person_id",
                          :association_foreign_key => "personifiable_id",
                          :readonly => false,
                          :conditions => ["personifiable_type = ? and people.archived_at is null and person_relations.archived_at is null and person_relation_type_id=?", "Person", PersonRelationType::CONTACT],
                          :insert_sql => proc {|record| "INSERT INTO person_relations(person_id, personifiable_id, personifiable_type, person_relation_type_id, created_at, updated_at) VALUES('#{self.id}', '#{record.id}', 'Person', 6, current_timestamp, current_timestamp)" }

  has_and_belongs_to_many :organisations,
                          :join_table => "person_relations",
                          :class_name => "Person",
                          :association_foreign_key => "person_id",
                          :foreign_key => "personifiable_id",
                          :readonly => false,
                          :conditions => ["personifiable_type = ? and people.archived_at is null and person_relations.archived_at is null and person_relation_type_id=?", "Person", PersonRelationType::CONTACT],
                          :insert_sql => proc {|record| "INSERT INTO person_relations(person_id, personifiable_id, personifiable_type, person_relation_type_id, created_at, updated_at) VALUES('#{record.id}', '#{self.id}', 'Person', 6, current_timestamp, current_timestamp)" }

Pretty complicated, but it does the job :) However, we kept getting deprecation warnings because :conditions, :insert_sql and :finder_sql are all deprecated and essentially removed in Rails 4.1+. I kept postponing the migration because it seemed really hard to translate. But one suggestion in the error message is to use has_many :through instead.

We already had the following statement in our Person model:

  has_many :person_relations

so for a simple linked model, we could just write

  has_many :parcels, through: :person_relations, source: :personifiable, source_type: 'Parcel'

and likewise, for our contacts, we just have to add a condition, but on PersonRelation, so we write:

  has_many :contacts,
           -> { where("person_relations.person_relation_type_id" => PersonRelationType::CONTACT)},
           through: :person_relations,
           class_name: "Person",
           source: :personifiable,
           source_type: 'Person'

I am not really happy with the explicit mention of person_relations in the condition; it might impact chainability later on, but I am not sure how I could handle that differently. For now this does the job really cleanly.
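For illustration, this is roughly how the new associations read at the call site (just a sketch; some_person is a placeholder record and the chained where is only an example):

some_person = Person.first

some_person.parcels    # parcels linked through person_relations
some_person.contacts   # people linked through person_relations with the CONTACT relation type

# the result is a regular relation, so it can be chained further
some_person.contacts.where(archived_at: nil)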

Now the problem is how to follow the reverse association (from the contacts to the organisations), and actually this also proves pretty simple. If I “reverse” the person relations first, then we can use that as the through table:

  has_many :reverse_person_relations, as: :personifiable, class_name: 'PersonRelation'
  has_many :organisations,
           -> { where("person_relations.person_relation_type_id" => PersonRelationType::CONTACT)},
           through: :reverse_person_relations,
           class_name: "Person",
           source: :person

At first I felt that the deprecation of :insert_sql, :delete_sql and :finder_sql would be an insurmountable hurdle, but it actually proved pretty simple to fix, and the end result is a lot simpler and even more readable. Nice :+1:

There is this recurring problem we have in GIS: getting road segments and wanting to show complete roads. The naive approach would be to do something like the following:

insert into street_geoms
select ro.rd_ro_ident, ro.rd_ro_name, ro.com_code, sdo_aggr_union(sdoaggrtype(ros.rd_ros_geometry, 0.005)) as geom
from rd_road ro, rd_ro_sec ros
where ros.rd_ro_ident = ro.rd_ro_ident
group by ro.rd_ro_ident, ro.rd_ro_name, ro.com_code;

For good measure: we have 45.000+ roads, with a total of 230.000+ road segments. So when that query started running and kept on taking a long time, I started googling. Apparently there are two faster alternatives: SDO_AGGR_CONCAT_LINES and SDO_AGGR_SET_UNION. While the first was really quick (it completed in minutes), the result was completely wrong (entire segments were missing). The second might be quicker, but it was really hard to get an idea of any progress, and if it failed, everything would be lost (rolled back).

So I decided to write a little script and issue a SQL statement for each single road, allowing me to track progress and adding restartability. For each road I issued a statement like:

insert into street_geoms
select ro.rd_ro_ident, ro.rd_ro_name, ro.com_code, sdo_aggr_set_union(CAST(COLLECT(ros.rd_ros_geometry) AS mdsys.SDO_Geometry_Array),0.005) as geom
from rd_road ro, rd_ro_sec ros
where ros.rd_ro_ident = ro.rd_ro_ident
  and ro.rd_ro_ident = 1895101 
group by ro.rd_ro_ident, ro.rd_ro_name, ro.com_code;

I added some Ruby code around it to track the progress and calculate the remaining time, just to have an idea. The first “large” road it stumbled upon literally took hours, and it only had to join 39 segments. A simple query showed I had 150+ roads with more segments than that, with a maximum of 125 segments for a single road. I could not simply ignore those :) So this was not going to work either.

Why would this be so hard? I just wanted to throw all the linestrings together into one geometry. How could I do that? Querying the geometries was really easy, so what if I joined the geometries outside of Oracle? Wouldn’t that be hard? Actually there is a simple solution: convert the geometries to WKT, and join all the LINESTRINGs into one MULTILINESTRING. That is just simple string manipulation. I can do that ;)
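The idea in a nutshell (a toy sketch with made-up coordinates; the real script below does exactly this per road):

line_strings = [
  "LINESTRING (155000.0 463000.0, 155010.0 463005.0)",
  "LINESTRING (155010.0 463005.0, 155025.0 463020.0)"
]

# strip the leading LINESTRING keyword so only the coordinate lists remain,
# then wrap all of them in a single MULTILINESTRING
coordinate_lists = line_strings.map { |wkt| wkt.sub(/\ALINESTRING\s*/, "") }
multilinestring  = "MULTILINESTRING(#{coordinate_lists.join(", ")})"
# => "MULTILINESTRING((155000.0 463000.0, 155010.0 463005.0), (155010.0 463005.0, 155025.0 463020.0))"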

I had some hiccups with this approach: handling the long strings proved a bit awkward (use a CLOB instead), I had to regularly call GC.start to make sure open cursors were released, and I had to make sure not to build a string literal that was too long (ORA-06550).

But in the end I was able to join the road sections for the 45.000+ roads in approx. 1.5 hours, which is not blindingly fast, but faster than a single SDO_AGGR_SET_UNION operation :) :)

For reference, here is the full code:

class StreetGeom < ActiveRecord::Base
  self.primary_key = 'rd_ro_ident'
end


def format_time (t)
  t = t.to_i
  sec = t % 60
  min  = (t / 60) % 60
  hour = t / 3600
  sprintf("% 3d:%02d:%02d", hour, min, sec)
end


def eta(count)
  if count == 0
    "ETA:  --:--:--"
  else
    elapsed = Time.now - @start_time
    # eta = elapsed * @total / count - elapsed;
    eta = (elapsed / count) * (@total - count)

    sprintf("ETA: %s", format_time(eta))
  end
end


all_roads = Road.count
geoms_to_calculate = all_roads - StreetGeom.count
@total = geoms_to_calculate

puts "Joining geometries for #{all_roads} roads [still #{geoms_to_calculate} to do]"


cntr = 1
@start_time = Time.now

done = 0


Road.order(:rd_ro_ident).each do |road|
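  # only process roads that do not have a joined geometry yet, so the script can be restarted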
  street_count = StreetGeom.where(rd_ro_ident: road.rd_ro_ident).count
  print "\rConverting #{cntr}/#{all_roads} [ #{eta(done)} ] "
  if street_count == 0
    print "..."
    $stdout.flush

    ## get all geometries in WKT format
    get_geoms_sql = <<-SQL
      select sdo_cs.make_2d(ros.rd_ros_geometry).get_wkt() as wkt_geom from rd_ro_sec ros where ros.rd_ro_ident = #{road.rd_ro_ident}
    SQL

    cursor = Road.connection.execute(get_geoms_sql)

    line_strings = []

    while row = cursor.fetch
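      # row[0] is a CLOB containing WKT like "LINESTRING (x y, x y, ...)";
      # keep only the coordinate list by stripping the leading "LINESTRING" keyword (10 characters)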
      line_string = row[0].read.to_s
      line_strings << line_string[10..-1]
    end


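    # Build a PL/SQL block that assembles the MULTILINESTRING in a CLOB variable.
    # The join string below closes the current literal and starts a new
    # "wkt_str := wkt_str || '...'" statement for every next segment, so no
    # single string literal becomes too long (which would trigger ORA-06550).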
    insert_sql = <<-SQL
      DECLARE
        wkt_str clob;
      BEGIN
        wkt_str := 'MULTILINESTRING(#{line_strings.join(", ';\nwkt_str := wkt_str || '")})';
        insert into street_geoms(rd_ro_ident, name, com_code, geom)
        values (#{road.rd_ro_ident}, q'[#{road.rd_ro_name}]', '#{road.com_code}',
             sdo_util.from_wktgeometry(to_clob(wkt_str)) );
      END;
    SQL

    Road.connection.execute(insert_sql)
    done += 1
  else
    print "_"
  end

  cntr += 1

  # periodically cleanup GC so we release open cursors ...
  # to avoid ORA-1000 errors
  if (cntr % 50) == 0
    GC.start
  end
end

print "\n"
puts "\n\nDone!"

and I run this script in the rails environment as follows: rails runner lib\tasks\join_road_geometries.rb.

We are in the process of migrating an old GIS system. For our new systems we use PostGIS, but this one still uses Oracle. The data spans two countries: Belgium and the Netherlands. Our system does something awful: all data is stored in RD (the Dutch coordinate system, using Oracle SRID 90112).

So how does data get into the system? Belgian data is entered as Lambert 72 (Oracle SRID 327680) and then transformed to 90112.

Our client uses a customised viewer that shows the data either in RD or in Lambert 72. Now we want to switch to a more generic solution and show the data in WGS84. We are using Oracle 11, so my initial process was the following:

  • extract the Belgian data from the tables and convert it back to 327680 (SDO_CS.transform(geom, 327680))
  • set the SRID to 31370 (which is the correct/best SRID for Belgium, as it has the correct transformation to WGS84) as follows: update be_geoms bg set bg.geom.sdo_srid = 31370 (so without transformation)
  • for the Dutch data I just set the SRID to 28992
  • and then I transform both to WGS84!

Easy! Done! Ready! However … I was not … The data was not positioned correctly. So I checked the definitions in MDSYS.CS_SRS for both 28992 and 31370, compared them to epsg.io, and lo and behold: both were wrong. So now I had to update them.

Updating EPSG:31370

delete from mdsys.cs_srs where srid=31370;
Insert into MDSYS.CS_SRS (CS_NAME,SRID,AUTH_SRID,AUTH_NAME,WKTEXT,CS_BOUNDS,WKTEXT3D) values ('Belge 1972 / Belgian Lambert 72',31370,31370,'IGN Brussels www.ngi.be/html-files/french/0038.html','PROJCS["Belge 1972 / Belgian Lambert 72", GEOGCS [ "Belge 1972", DATUM ["Reseau National Belge 1972 (EPSG ID 6313)", SPHEROID ["International 1924 (EPSG ID 7022)", 6378388.0, 297.0], -106.869,52.2978,-103.724,0.3366,-0.457,1.8422,-1.2747], PRIMEM [ "Greenwich", 0.000000 ], UNIT ["Decimal Degree", 0.0174532925199433]], PROJECTION ["Lambert Conformal Conic"], PARAMETER ["Latitude_Of_Origin", 90.0], PARAMETER ["Central_Meridian", 4.3674866666666667], PARAMETER ["Standard_Parallel_1", 51.1666672333333333], PARAMETER ["Standard_Parallel_2", 49.8333339], PARAMETER ["False_Easting", 150000.013], PARAMETER ["False_Northing", 5400088.438], UNIT ["Meter", 1.0]]',null,'PROJCS[
  "Belge 1972 / Belgian Lambert 72",
  GEOGCS["Belge 1972",
    DATUM["Reseau National Belge 1972",
      SPHEROID[
        "International 1924",
        6378388.0,
        297.0,
        AUTHORITY["EPSG", "7022"]],
      TOWGS84[-106.869,52.2978,-103.724,0.3366,-0.457,1.8422,-1.2747],
      AUTHORITY["EPSG", "6313"]],
    PRIMEM["Greenwich", 0.000000, AUTHORITY["EPSG","8901"]],
    UNIT["degree (supplier to define representation)", 0.0174532925199433, AUTHORITY["EPSG", "9122"]],
    AXIS["Lat", NORTH],
    AXIS["Long", EAST],
    AUTHORITY["EPSG", "4313"]],
  PROJECTION ["Lambert Conformal Conic"],
  PARAMETER ["Latitude_Of_Origin", 90.0],
  PARAMETER ["Central_Meridian", 4.3674866666666667],
  PARAMETER ["Standard_Parallel_1", 51.1666672333333333],
  PARAMETER ["Standard_Parallel_2", 49.8333339],
  PARAMETER ["False_Easting", 150000.013],
  PARAMETER ["False_Northing", 5400088.438],
  UNIT["metre", 1.0, AUTHORITY["EPSG", "9001"]],
  AXIS["X", EAST],
  AXIS["Y", NORTH],
  AUTHORITY["EPSG", "31370"]]');

… and this worked: my transformation for the Lambert data is now correct!

Updating EPSG:28992

… proved to be a little trickier. I assumed I could just reuse the same method as for the Belgian coordinate system (yes, I know, assume = ass-u-me).

I was unable to just delete or update 28992 because I got an error that a child record existed: ORA-02292 with reason COORD_OPERATION_FOREIGN_SOURCE. Googling this revealed nothing at all.

So I had to dig deeper. And deeper. It turns out MDSYS.CS_SRS is actually a view that updates the underlying tables, and the TOWGS84 parameters I had to change are stored in MDSYS.SDO_DATUMS. After some searching it proved to be quite easy: to update EPSG:28992, I just had to do:

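-- datum_id 6289 is the Dutch Amersfoort datum behind EPSG:28992;
-- the values below are its corrected TOWGS84 shift/rotation/scale parameters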
update mdsys.sdo_datums set
  shift_x = 565.417,
  shift_y = 50.3319,
  shift_z = 465.552,
  rotate_x = -0.398957,
  rotate_y = 0.343988,
  rotate_z = -1.8774,
  scale_adjust = 4.0725
where datum_id = 6289;

EXECUTE SDO_CS.UPDATE_WKTS_FOR_EPSG_DATUM(6289);

My initial (naive) assumption was that the SDO_CS.UPDATE_... procedures would actually retrieve the latest EPSG definitions; unfortunately, no such luck :) :)

Stuff like this makes me appreciate PostGIS even more.

In a project we built, we are using que for our background jobs, and there is a very simple (but sufficient) and clean web UI called que-web, which allows us to monitor the status of the jobs online.

Normally, you just include it in your project by adding the gem, and then adding the following to your config/routes.rb:

require "que/web"
mount Que::Web => "/que"

But this is completely open and unauthenticated. We use devise, and it is really easy to limit a route to authenticated users:

require "que/web"
authenticate :user do 
  mount Que::Web => "/que"
end

At least this limits access to logged-in users. But we wanted it to be available only to admin users.
So I thought I had to resort to defining my own constraint class, as follows:

class CanSeeQueConstraint
  def matches?(request)
    # determine if current user is allowed to see que
  end
end

and in the routes write it as follows

require 'can_see_que_constraint'
mount Que::Web, at: '/que', constraints: CanSeeQueConstraint.new 

The problem was: how do I get to the current user in a constraint class? So I took a peek at how the authenticate block in devise works, and apparently there is an easier option: the authenticate block takes a lambda in which you can test the currently authenticated user. Woah! Just what we need. So we wrote the following to allow only our administrators to see/manage our background jobs:

authenticate :user, lambda {|u| u.roles.include?("admin") } do
  mount Que::Web, at: 'que'
end

Normally in Rails, you can only render views from inside a controller. But what if you want to render a view somewhere else? For instance, we wanted to generate XML files using views: Haml can describe XML just as well as plain HTML.

There is a gem called render_anywhere that allows just that. How does this work? For example:

class Organisation < ActiveRecord::Base

  has_many :members

  include RenderAnywhere

  def to_xml
    render partial: "#{self.to_partial_path}", object: self, layout: 'my_xml_layout'
  end
end
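A hypothetical call site then looks like this (render_anywhere's render returns the rendered result as a string, so something like this should work):

xml = Organisation.first.to_xml
File.write("organisation.xml", xml)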

We had a little problem when using partials though.

Normally if you type something like

= render @member

it will ask the model for its partial path (@member.to_partial_path), but somehow this always got prefixed with render_anywhere. The gem creates a dummy RenderingController in the RenderAnywhere namespace, so apparently it looks for the following view:

render_anywhere/members/member

In our case I did not want to use the render_anywhere subfolder. It took me a while to figure out how to overrule this, but in essence it is pretty simple: Rails uses the namespace of the rendering controller to prefix the view path. Some deep googling showed that every controller has a method called _prefixes which lists the view-path prefixes for that class.

We can easily verify this in the rails console:

:001 > RenderAnywhere::RenderingController._prefixes
=> ["render_anywhere/rendering"]

So if we could overrule _prefixes to just return ["rendering"] … Mmmmmm fork the code of render_anywhere? Or …

There is another option: render_anywhere allows you to supply your own RenderingController, and it will use that one instead if it is found in the context where the RenderAnywhere module is included.

So, if you write something like:

class Organisation < ActiveRecord::Base

  has_many :members

  include RenderAnywhere

  class RenderingController < RenderAnywhere::RenderingController

    def self._prefixes
      ["rendering"]
    end

  end

  def to_xml
    render partial: "#{self.to_partial_path}", object: self, layout: 'my_xml_layout'
  end
end

it will look for a view called members/member. Woot. To specify a different sub-folder you can adapt the _prefixes method as you wish :)
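For example (the folder name below is made up), the nested controller from the snippet above could point the lookup at another sub-folder:

class RenderingController < RenderAnywhere::RenderingController
  # partials are then resolved relative to this prefix instead of "rendering"
  def self._prefixes
    ["xml_renderings"]
  end
end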

Developing Rails websites with a geographic component, we rely heavily on PostGIS, so we use activerecord-postgis-adapter for the PostGIS support, and I always use schema_plus because it allows me to define views. Until recently, I had to use structure.sql instead of schema.rb because the geometric columns did not dump correctly.

But for a while now activerecord-postgis-adapter has handled this correctly, so we switched back to the schema.rb file, only to discover a “new” error:

ActiveRecord::StatementInvalid: PG::DependentObjectsStillExist: ERROR:  cannot drop view geography_columns because extension postgis requires it
HINT:  You can drop extension postgis instead.
: DROP VIEW IF EXISTS "geography_columns"

Apparently PostGIS's own views are also dumped into the schema file, and those views obviously cannot simply be dropped and re-created.

A very naive solution I kept using was to comment out those create_view lines in our schema.rb file. But there is a much better solution: you can configure which tables and views the schema dumper should ignore.

So I added an initializer in config/initializers/schema_dumper.rb with the following content:

ActiveRecord::SchemaDumper.ignore_tables = [
   "geography_columns", "geometry_columns", "spatial_ref_sys", "raster_columns", "raster_overviews"
]

And now my schema.rb is correct, and simple commands like rake db:setup or rake db:test:prepare just work. Hehe.

I have a very weird problem with my GeoServer + Oracle setup when deployed on a Windows 2012R2 server (see here), and in an attempt to solve it, I upgraded GeoServer from 2.6.3 to 2.7.1, hoping that would fix it.

Sometimes fairy tales come true, but in this case the upgrade did not help, unfortunately. Version 2.7.1 did render a lot quicker, except for one layer, which did not render at all anymore.

My style could not render, with the error The requested Style can not be used with this layer. The style specifies an attribute of <missing attribute name>. Checking the layer in GeoServer, I could see it was no longer able to determine any of the attributes for the given table.

Further investigation in the logfile revealed the following (cryptic) error:

Failure occurred while looking up the primary key with finder: org.geotools.jdbc.HeuristicPrimaryKeyFinder@24cf7139

java.sql.SQLException: Exhausted Resultset

Mmmmmm. Luckily my google-fu revealed a linked issue, and a simple solution:

updating the driver from ojdbc14.jar to the newer ojdbc7.jar fixes this problem.

Hehe :)

Updating GeoServer did not fix my problem: my layer still had some duplicate column names. This might not be such a big problem: everything is drawn correctly and WMS calls work, but WFS calls gave the irritating yet predictable error ORA-00918: column ambiguously defined. Annoying.

So how does one find the column names of a table in Oracle? With a query like:

select * from dba_tab_columns where table_name = 'YOUR_TABLE_NAME';

and all of a sudden I saw the same set of column names, with some duplicates. Apparently my Oracle database contains the table twice, in two different schemas. Since my user had permission to access the other schema, it seems GeoServer does not limit its query to the (specified) schema at all.

The fix then was easy: make the other schema inaccessible. In my case the second schema was for testing purposes, so I could just delete it.

At my current job we make GIS websites using Rails and GeoServer. I develop on a Mac, and for some clients we need to deploy on Windows. One client is still using an Oracle database, while in general I prefer to work with PostGIS databases; GeoServer also offers better support for PostGIS.

So: when working locally I saw a really weird phenomenon in my GeoServer: it duplicated various Oracle columns. For viewing this is generally not a problem, but when using WFS I got the “column ambiguously defined” error, Oracle SQL views did not work (it went looking for metadata?), and GeoServer SQL views were painfully slow.

But on my client's server I installed GeoServer 2.6.3 and the Oracle stuff just worked. Woot :) So I had to upgrade my ancient 2.3.3 GeoServer, which runs inside a Tomcat. Upgrading seemed easy enough: copy the old geoserver folder somewhere (actually you only need the data folder and the web.xml, but I am lazy/extra safe like that), drop in the new war, and theoretically we should be good to go.

Except … I got this peculiar error in my log-file

SEVERE: Error listenerStart

WTF! Thanks to some googling, I added a file logging.properties to my geoserver\WEB-INF\classes with the following content:

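# raise the Catalina log level so the real exception behind "Error listenerStart" shows up in the log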
org.apache.catalina.core.ContainerBase.[Catalina].level = DEBUG
org.apache.catalina.core.ContainerBase.[Catalina].handlers = java.util.logging.ConsoleHandler

restarted my tomcat, and the following appeared:

SEVERE: Error configuring application listener of class org.geoserver.platform.GeoServerHttpSessionListenerProxy
java.lang.UnsupportedClassVersionError: org/geoserver/platform/GeoServerHttpSessionListenerProxy : Unsupported major.minor version 51.0 (unable to load class org.geoserver.platform.GeoServerHttpSessionListenerProxy)

Now what the hell kind of cryptic error is that? Apparently it is a very compact way of saying that this code needs Java 1.7 while I am still running Java 1.6 (I am looking at you, Apple). Updating now :)