Jenkins cookbook for CHECK

We use the Jenkins Continuous Integration server to automate tasks that would otherwise take days and surely generate more errors. We build the database server and web infrastructure daily for our most important application (a legacy mix of classic ASP, .NET 2.0/3.5 parts and a lot of Oracle 11g PL/SQL components). When the build completes, a set of integration tests is run with Selenium.
Diagram of CHECK workflow
Deployment workflow

How we came to this

While developing on a dedicated test server for years, we frequently ran into integration problems when deploying code to production. First we started using NAnt to create an automated deployment, so we could guarantee the software was deployed the same way every time. The real problem was that the Oracle test database was starting to differ too much from the production site. To make sure we always used the same infrastructure, I took up the laborious task of creating scripts for every object in the Oracle database and put them under version control. Finally, I also created the scripts to create the Oracle database itself from the command line and bundled them with the object creation scripts.

By then (December 2014) we had to migrate the entire application to another infrastructure provider, so I used the new scripts to create the new environment there from scratch. The servers are all virtual, using VirtualBox technology. For orchestration I used Vagrant, the amazing tool by Mitchell Hashimoto with Ruby-style config files, coupled with Chef (also Ruby) for further automation and provisioning once the virtual machine is up and running. In March 2015 I created nightly build jobs in Jenkins to create the database and web infrastructures.

The steps in this build:

Step 1: Deploying the database server

  • deploying a Vagrant Windows 2012 box that is somewhat preconfigured (see the sketch after this list)
  • installing the Chef client for Windows
  • starting a Chef provision run, which performs the remaining steps:
  • installing a telnet client
  • installing .NET 3.5
  • getting the database creation and configuration scripts from git
  • installing Oracle 11g
  • creating an Oracle database with several tablespaces
  • creating an Oracle listener
  • creating the Oracle schemas (about 7)
  • creating the database objects in Oracle (around 500 tables, 80,000 lines of PL/SQL code and many more objects)
  • populating the entire database with data
  • refreshing all indexes, snapshots and statistics
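
To give an idea of how the first bullets fit together, below is a minimal, hypothetical sketch of a Vagrantfile for the database server. The box name, IP address and recipe names are placeholders, not our real ones.

# Hypothetical Vagrantfile sketch for the database server build.
Vagrant.configure("2") do |config|
  config.vm.box          = "windows-2012-r2-base"   # a somewhat preconfigured Windows 2012 box
  config.vm.communicator = "winrm"                  # Windows guests are managed over WinRM, not SSH
  config.vm.network "private_network", ip: "192.168.50.10"

  # Hand over to Chef once the VM is up; each recipe covers a group of the steps above.
  config.vm.provision "chef_solo" do |chef|
    chef.add_recipe "check_db::oracle_install"      # telnet client, .NET 3.5, Oracle 11g
    chef.add_recipe "check_db::create_database"     # database, tablespaces, listener, schemas
    chef.add_recipe "check_db::load_objects"        # objects, data, indexes, snapshots, statistics
  end
end

With a setup like this, rebuilding the server from scratch is essentially a 'vagrant up'.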

Step 2: Deploying the web server

  • deploying a Vagrant Windows 2012 box that is somewhat preconfigured
  • installing the Chef client for Windows
  • starting a Chef provision run, which performs the remaining steps:
  • installing a telnet client
  • installing .NET 3.5
  • installing the IIS web server role for application development, with ISAPI extensions, ASP.NET and such
  • mapping a drive to a data volume
  • installing an Oracle client using a silent install with an answer file
  • configuring the Oracle client by placing configuration files (tnsnames.ora and such)
  • doing a git clone of the custom ASP code that runs on the server
  • opening the local firewall for some web ports
  • setting the timezone to CET
  • disabling NLA for RDP
  • configuring IIS with a custom recipe (a sketch follows this list), which includes:
  • creating and configuring an app pool for the application
  • creating a site to host the code
  • configuring the site with some debug attributes for development
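
The custom IIS recipe mentioned above is roughly shaped like the hypothetical Chef sketch below. It leans on the community 'iis' cookbook and the windows_feature resource; the pool, site name and path are made up.

# Hypothetical sketch of the custom IIS recipe.
windows_feature 'IIS-WebServerRole'        # IIS role
windows_feature 'IIS-ISAPIExtensions'      # ISAPI extensions for the classic ASP parts
windows_feature 'IIS-ASPNET'               # ASP.NET support

iis_pool 'check_pool' do
  runtime_version '2.0'                    # the legacy app targets .NET 2.0/3.5
  action :add
end

iis_site 'check_site' do
  protocol :http
  port 80
  path 'D:/sites/check'                    # placeholder path on the mapped data volume
  application_pool 'check_pool'
  action [:add, :start]
end

Because Chef resources are idempotent, re-running the nightly job simply converges the web server back to this state instead of failing on already-existing pools and sites.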

Step 3: Running integration tests

  • running Selenium tests which were recorded with Firefox (a rough code equivalent is sketched after this list)
  • each test performs a login test followed by a use case test
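
The tests were recorded in Firefox, but a login test boils down to roughly the sketch below, written here with the selenium-webdriver Ruby gem; the URL and element names are placeholders.

# Rough sketch of a recorded login test, using the selenium-webdriver gem.
require 'selenium-webdriver'

driver = Selenium::WebDriver.for :firefox
wait   = Selenium::WebDriver::Wait.new(timeout: 10)

driver.navigate.to 'http://192.168.50.20/login.asp'    # placeholder URL of the test web server
driver.find_element(name: 'username').send_keys 'testuser'
driver.find_element(name: 'password').send_keys 'secret'
driver.find_element(name: 'login').click

# The login test passes when the application's start page shows up.
wait.until { driver.title.include?('CHECK') }
driver.quit

The use case tests follow the same pattern: navigate, fill in fields, click, and wait for the expected result.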

Step 4: Automated deployment (when the integration tests succeed)

  • creates a deployment zip package
  • populates the zip with a file structure for the acceptance and production environments
  • runs a Node.js script to edit the config files and set them to the correct values in each environment's directory structure (see the sketch after this list)
  • updates the README in the main directory with instructions
  • uploads the zip package, ready to be deployed, to our configuration management system
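
The config editing in the third bullet is done by a small Node.js script; purely to illustrate the idea (and to keep these sketches in one language), here is roughly what such a rewrite looks like in Ruby, with a hypothetical package layout and hostnames.

# Illustrative only: the real build step uses a Node.js script.
# Rewrites the Oracle host in each environment's web.config inside the package.
require 'rexml/document'

TARGETS = {
  'acceptance' => 'acc-oracle.example.local',   # placeholder hostnames
  'production' => 'prd-oracle.example.local'
}

TARGETS.each do |env, host|
  file = File.join('package', env, 'web.config')
  doc  = REXML::Document.new(File.read(file))
  doc.elements.each('configuration/connectionStrings/add') do |node|
    node.attributes['connectionString'] =
      node.attributes['connectionString'].gsub(/HOST=[^)]+/, "HOST=#{host}")
  end
  File.write(file, doc.to_s)
end

Which values actually get replaced (connection strings, hostnames, debug flags) depends on the application; the point is that the package leaves the build with every environment's configuration already in place.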

DEVOPS tools of the trade

One of the important concepts that came to DEVOPS from the Extreme Programming guys is Continuous Integration. It means you build and assemble daily all the software that has been developed for your product, and run tests to check that it all still works.

Diagram CD
CI workflow

When someone makes an error or introduces a sneaky bug that breaks the product (we call this a ‘regression’), it becomes clear immediately and can be handled before shipping.
As a system engineer I really like this idea, and I introduced the concept to our software process some years ago. Many tools for build automation and integration exist; let me first mention the (really) big ones. I will explain which one we used in a later post.

Jenkins

A server, deployed in a Tomcat servlet container, for running automated builds. Its plugins and workflow structure allow you to connect to configuration management tools, test servers, you name it. You can hook it up to your version control system of choice, like Git or SVN. For instance, you can trigger a build when a developer commits new code, then build the source and run tests afterwards (or deploy to some environment). Jenkins has been around for some time and its plugin ecosystem is quite rich; you can integrate almost anything with it.

TeamCity

Created by JetBrains, who also make beautiful IDEs (IntelliJ IDEA, WebStorm). It has arguably the same features as Jenkins, but it seems more Microsoft-friendly.

Bamboo

The CI system by Atlassian also delivers features for building and testing software in a workflow. The strong point of this solution is that it ties in neatly with other Atlassian products like Jira and Confluence. It is based on open source tooling.

Team Foundation Server (TFS)

Microsoft’s entry in this market. It handles not just building and testing, but offers release management and project management capabilities as well. It combines nicely with all Microsoft server products (of course). I used to vote against it because I don’t like the idea of vendor lock-in, but with the new wind blowing inside Microsoft under CEO Nadella and their embrace of open source, it could work in a heterogeneous environment.

Agile operations == DEVOPS

It has been around for some years but when I read about this concept last year I immediately felt comfy: DEVOPS. It’s actually two words and it means the marriage between IT development and IT operations.

Screenshot of windows error
Operations problem

When these departments or people do not communicate, bad things happen. I have seen and felt this in person; I remember nightly visits to a datacenter in Rotterdam to reboot Unix boxes in the middle of the night (Dutch readers may remember Girotel Online). What actually happens a lot is a disconnect between what is traditionally considered development work and what is traditionally considered operations work.

Smoothing the bond between DEV and OPS seems only logical to me, as I started my career in IT systems administration and made the switch to the developer ‘camp’ 15 years ago. Since those days I have never understood the divide between development and administration departments within big enterprises. I have worked in both worlds, and in my opinion we are all trying to reach the same end goal: a happy customer.

Oracle External tables over the network

At one of the projects we do for KPN we use Oracle as a database. For some daily batches we use the external tables mechanism to load data from files we receive from another supplier in the chain.

The files are in comma-separated values (CSV) format, like the example below (I used the Oracle example):

56november, 15, 1980  baker  mary  alice  09/01/2004
87december, 20, 1970  roper  lisa  marie  01/01/1999

Oracle can represent this file internally as a table, and you can then subject it to SQL queries. A special driver is used to access these files; originally it could only access a local filesystem on the server. At the client we used a volume that was mounted from a SAN through fiber channel. Unfortunately we ran into a problem when we wanted to use this mechanism on our production servers. I could reproduce the problem on our development servers using iSCSI (we don’t have fiber channel there), which is roughly the same mechanism as used by the fiber channel driver.

I created an iSCSI target and a volume and mounted it on v:\mount.


As a DBA, create this directory object in SQL*Plus or your IDE to point to a directory on the iSCSI drive:

CREATE DIRECTORY ext_tab_dir AS 'v:\test';
GRANT READ, WRITE ON DIRECTORY ext_tab_dir TO SCOTT;

As user SCOTT we create this table:

CREATE TABLE emp_load
  (employee_number      CHAR(5),
   employee_dob         CHAR(20),
   employee_last_name   CHAR(20),
   employee_first_name  CHAR(15),
   employee_middle_name CHAR(15),
   employee_hire_date   DATE)
  ORGANIZATION EXTERNAL
  (TYPE ORACLE_LOADER
   DEFAULT DIRECTORY ext_tab_dir
   ACCESS PARAMETERS
   (RECORDS DELIMITED BY NEWLINE
    FIELDS (employee_number      CHAR(2),
            employee_dob         CHAR(20),
            employee_last_name   CHAR(18),
            employee_first_name  CHAR(11),
            employee_middle_name CHAR(11),
            employee_hire_date   CHAR(10) date_format DATE mask "mm/dd/yyyy"
           )
   )
   LOCATION ('emp_load.dat')
  );

The following error will occur if you try to view data in this table as user SCOTT, for example with this SQL:

select * from emp_load;

ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04027: file name check failed: V:\test\EMP_LOAD_1488_1176.log

Now, to prove that it is the external tables driver that is not working, I will show you that the utl_file API (another driver to access files through PL/SQL) DOES work.

Try this anonymous PL/SQL block:

DECLARE
  l_file     UTL_FILE.file_type;
  l_location VARCHAR2(100) := 'EXT_TAB_DIR';
  l_text     VARCHAR2(32767);
BEGIN
  -- Open file.
  l_file := utl_file.fopen(l_location, 'emp_load.dat', 'r', 32767);
  -- Read and output first line (header record).
  utl_file.get_line(l_file, l_text, 32767);
  dbms_output.put_line(l_text);
  utl_file.fclose(l_file);
END;

It will print:

56november, 15, 1980 baker mary alice 09/01/2004

So how do you make external tables work in this situation? Use a network share pointing to the same location as the fiber channel/iSCSI drive, mapped as a network drive (let’s call this share ‘test’). First map the network location to a drive on the Windows command prompt of the Oracle server:

net use l: \\192.168.50.5\test

Now change the Oracle directory object to point to the network share containing the file:

CREATE OR REPLACE DIRECTORY ext_tab_dir AS '\\192.168.50.5\tmp\';
GRANT READ, WRITE ON DIRECTORY ext_tab_dir TO SCOTT;

Agile methods and technologies for KPN HR-Analytics

This is a translation of the first part of the introduction of my thesis:

Frontpage of thesis
The need for relevant information to distinguish yourself from others, making you faster than your competitors, has existed since the creation of the first societies. In the course of recorded history, more and more data accumulated in all kinds of administrations. Since the advent of the computer and the use of databases, there are all kinds of new possibilities to analyze this data and transform it into useful information. The management of large organizations (government and industry) soon acknowledged the usefulness of these analyses.

The systems and applications for this analysis were called Management Information Systems (MIS). In the last decade, features were added to these systems, which enabled aggregation and different ways to present information. The data is collected in a so-called data warehouse.
Since the expansion of functionality, these systems are known as Business Intelligence (BI) systems. They are indispensable for organizations in business and government to support the management in strategic and tactical issues.

Yet the amount of data stored within organizations keeps growing at a tremendous pace. This is due to the ever-expanding external environment in which organizations (are able to) operate (Aldrich & Mindlin, 1978). Think of blurring boundaries and faster communication; in short, globalization. This data stream will have to be processed more rapidly in order to maintain a competitive advantage (Choo, 1995).

The design and construction of these systems usually requires a lot of time and money. The potential revenues when introducing such systems are high, but research shows that there is a large gap between investing in a better BI environment and reaping the benefits of it (Williams and Williams, 2007). Unfortunately, a third to a half of data warehouse development projects fail (Hayen et al., 2007). One of the reasons is the complexity of the systems and of the data to be processed; various techniques are involved and the requirements are often unclear. A possible consequence of this failure is that companies have poorer reporting capabilities: they lack insight into their own activities and are less successful.

The fact that many BI projects fail is very unfortunate because in recent years much progress has been made in the mainstream software development industry through the use of agile development methodologies. These emphasize the continuous delivery of new functionality, intensive communication between customers and the development team and a total focus on quality. The Agile principles can also be applied to BI development projects but in the real world this is not common yet.
Several authors have written on this subject and argue that it is a misconception that agile principles are not applicable to BI (K. Collier, 2011). The best practices that grew out of these principles can be applied in modified form.
This thesis examines whether this also applies to KPN’s HR BI environment.

Back to life

A month or so ago I turned on my iMac and something weird happened: the screen stayed as black as oil. After a reset it did actually start to boot, but after a couple of minutes… bam, black screen. I could hear the fans still working, but the machine had died. I did some NVRAM resets and the like and was able to boot to the console. Unfortunately, after examining the log files I found kernel panics while loading the Nvidia driver, which pointed to graphics card problems. After some research on the web I found that my iMac’s GPU (Nvidia 8800GS) is known for overheating problems. I must say I had never experienced this before, but apparently all of those cards eventually succumb.

After calling Apple support, they let me send the iMac to a nearby store for repair. A week later came the dreaded answer: ‘We do not supply this card anymore, so we cannot fix your Mac’.

This was quite a blow, since I really love this hardware. They even wanted me to pay for examining the iMac, but fortunately an Apple Support manager offered reimbursement by Apple.

So what then? I found a company on the internet that has made repairing chipsets its business. Still quite a gamble, as it costs about 230 euros, but it was my last chance to get the iMac working again. Remember, the form factor of this card is so specialised that, along with one obscure ATI card, it is the only one that will fit into the casing of an iMac, attached to the logic board.

Screen removed

Bezel removed

LCD off

Naked iMac

Logic board with video card attached

The defective card

So I had to disassemble the entire iMac to get the graphics board out of it. This was quite an endeavour, because Apple used some special Torx screws and clever engineering. But I succeeded, and created a package with the video card to be sent to the UK.

After a week or so it returned, supposedly fitted with a brand new GPU (graphics processing unit). A bit nervous and hopeful, I reassembled the iMac.

Reassemble!

Booting…

My old desktop!

It works!! Thanks Haytek LTD!!

Go supersecure

At Cloud Seven we endorse the super-secure G/On product. It replaces the VPN technology that is commonly used to give employees remote access to the company network. The easy part: you just plug in the USB stick, boot your computer, log in, choose your application, and it starts automatically and connects to your company’s servers and networks!

gon stick
G/ON USB stick

G/On was acquired in 2012 by Excitor, the creators of DME, who integrated it into their mobile app as the Appbox feature (mobile secure email synchronization and remote secure apps). It was recently selected to secure English councils in the U.K.

It is very easy to implement, uses industry grade encryption for connections and links up to your Active Directory or LDAP server for authentication.

Developing using Webstorm+Docker+nodejs on OS X and debugging it

First install the latest Docker (I use the Mac version for this tutorial): http://docs.docker.com/mac/step_one/

Get Webstorm: https://www.jetbrains.com/webstorm/
Start the Docker Quickstart Terminal (see screenshot). This starts the default docker VM (Linux inside OS X). You can find it later by issuing 'docker-machine ls':

docker-machine ls

Go and download an image inside the docker terminal, using 'docker pull ubuntu':

docker pull ubuntu

To run an interactive shell in the Ubuntu image, use 'docker run -i -t ubuntu /bin/bash':

docker run -i -t ubuntu /bin/bash

Of course we also want to mount our home directory, e.g.:

docker run -it -v $HOME:/mnt ubuntu

Inside the container you can check that your mount is there:

ls /mnt

Go and create a folder on the host system that is going to contain the app you want to build, for example /Users/thajoust/dev. Create a package.json file by issuing 'npm init' on the host system in $HOME/dev, or create an empty one with 'touch $HOME/dev/package.json' in the host system shell.

touch $HOME/dev/package.json

Now if you go to the docker terminal and issue the command 'ls /mnt/dev/' you will see the package.json file:

ls /mnt/dev/

You can exit the container by issuing 'exit' in the container shell. You now need to run the docker image from the docker terminal to mount the new directory directly:

docker run -it -v $HOME/dev:/testApp -w /testApp ubuntu

Now you will be in the container and you will be able to see the package.json. You can safely delete it now.

Next up is installing Node in the container. First update the package state with 'apt-get update', then:

apt-get install nodejs

Install npm: 'apt-get install npm'

apt-get install npm

Install the Node version manager 'n' globally: 'npm install n -g'

npm install n -g

First install wget: 'apt-get install wget'

apt-get install wget

Then install the latest Node.js: 'n latest'

n latest

Install the Express generator: 'npm install -g express-generator'

npm install -g express-generator

Inside your testApp folder you can create an Express app: 'express myapp'

express myapp

Now we have to introduce a quick workaround for the problem that on some platforms there will be issues with symlinks:

echo "bin-links=false" >> $HOME/.npmrc

There is now a folder with the name myapp, so let's continue: 'cd myapp' followed by 'npm install' (which installs all the modules required for running your app). Then it is time to start the app with 'npm start'. Now go to Webstorm and open the folder myapp.

npm start

You will see all the files. Configure the Run/Debug Configuration: create a Node.js Remote Debug configuration using the + in the left corner, with Host: 192.168.99.100 and Port: 5858.

Exit the docker container and commit it:

exit
docker ps -l
docker commit -m "message" containerId [newNameForTheImage]:[tag]

Now we need to open the ports that the node app listens on (both the app and the debugger):

docker run -it -v $HOME/dev:/testApp -w /testApp/myapp -p 3000:3000 -p 5858:5858 ubuntu:version1

Run the node Express app in debug mode:

node --debug ./bin/www

If you go to Webstorm and run the app with your Debug Configuration, you will see that everything is working. Below is the breakpoint being reached when I set it and go to http://192.168.99.100:3000.

breakpoint reached