IS at DrupalCon – Hola Barcelona!

Driesnote keynote talk at DrupalCon Barcelona 2015

It’s 7.30 in the morning in our hotel and I have already piled my breakfast plate with more than I can possibly finish.  It must be DrupalCon!

This week some IS colleagues and I have escaped from autumnal Edinburgh to balmy Barcelona for this year’s European Drupal conference, DrupalCon 2015.  For three days we are in the midst of a whirlwind of sessions on all aspects of working with Drupal, many of which touch on issues and experiences common to all web developers and site owners.

Our journey to create a new Drupal CMS for the University’s main website began before DrupalCon in Prague in 2013. Since last year’s DrupalCon in Amsterdam our new Drupal CMS has moved from its embryonic state following the initial MVP release in 2014 to a production system with a name, EdWeb, and upwards of 140 sites have so far been migrated across to the new system.  You can read more about our Drupal journey at the University Website Programme site in EdWeb.

Soon we embark on the exciting process of planning the next set of features to add to our shiny new responsive website and I’m sure that as before we will find much to inspire us at DrupalCon.  It’s also a fantastic opportunity to explore how other organisations are managing scalability and performance when running an Enterprise level CMS.  Judging by the number of cloud hosting companies in the exhibition hall this year, the answer to that problem for many organisations is to let someone else feel at least some of the pain for you!

As we did last year, over the next few days we will be gathering together short summaries of some of the sessions we attend here at DrupalCon, with our own reflections on what we see and hear. This year we are fortunate enough to have brought a group of colleagues who have played a range of roles in the creation of EdWeb, from development through to project management and production staff and there is something at DrupalCon for everyone!

But before we share any session notes, the Prenote and Keynote sessions from the first day of DrupalCon have already given us food for thought.

At the centre of yesterday’s 8am Prenote, which is always an entertaining way to start DrupalCon, was the notion of dreaming the impossible dream, expressed charmingly in song by Adam Juran, in this case the seemingly impossible dream of getting Drupal 8 released.  Having been involved with the huge undertaking of building EdWeb from the outset, that sentiment was very familiar!  Throughout the ups and downs of such a large scale and complex Agile project, it’s been important to keep our end goal in sight, and to believe that what we are trying to achieve with EdWeb is both possible and necessary.  Now we have our production CMS, and it was announced at DrupalCon today that Release Candidate 1 for Drupal 8 will ship on 7th October 2015.  Those impossible dreams can be realised!

The morning continued with the opening Keynote by Drupal founder Dries Buytaert, and once again we see parallels between the process of getting Drupal 8 to a shippable release and our own experience of building a large scale CMS.  This year, Dries’ theme was “We need to talk about that”, covering some of the uncomfortable questions in the Drupal community.  Two aspects of his talk in particular struck a chord as they relate to problems that we have also had to solve.

In talking about the extended timescale that’s been necessary to get Drupal 8 ready, Dries proposed an alternative model for Drupal’s code management, a branching strategy rather than the current approach of having all development on the trunk.  This would embrace the difficult reality of getting all features ready for a release; instead, what goes into the release is only what is ready.  During the course of our own CMS development, we have had to solve exactly that problem, releasing feature bundles into the production system at the end of each development iteration without releasing code that is not ready to ship.  Our initial workflow was to do development on the trunk, but we quickly realised that this creates problems, particularly as we run our migration project in parallel with ongoing CMS development work.  We also had to allow for the release of a security patch for Drupal itself, or for a contrib module we are using, which would need to take precedence over any feature development and go into production sooner, without including features that are not ready.  To solve those problems in a way that would support our automated deployment process, we moved to a workflow that turned out to be a variation on Gitflow Workflow and this has served us extremely well, allowing us to manage parallel development work and release only code that is production-ready into our live system.  It was very interesting to hear how Dries has come to the same conclusion as us – that for large-scale development work involving multiple developers and many features with a complex life cycle, feature branches are the way to go.  The detail of our own approach is a topic for a future post!

Another theme of Dries’ talk was usability in Drupal and how features that improve the experience for editors can be sacrificed in favour of features that add new functionality.  In creating EdWeb, we have involved users from the outset, whether via quick paper prototyping sessions to determine the best approach for a particular interface detail, or by running sessions where all members of our team, including developers, were able to watch editors actually use EdWeb so we could pinpoint usability problems.  That process has contributed hugely to the usability of EdWeb, but it’s undoubtedly true that when the pressure is on to develop new features, it’s very difficult to hold to the discipline of prioritising usability.  That problem is not unique to Drupal development; it’s a perennial problem that is particularly troublesome for Agile projects.

So there it is – before we even got to 10am on the first morning of DrupalCon, there was already a lot to think about!  Watch this space for daily posts with session notes from our team.  And if you want to see what we’re so excited about, check out the DrupalCon YouTube channel for session recordings!

Adding embedded Tomcat AJP support to a Spring Boot application

We currently use Apache with mod_jk in front of our Tomcat application servers. I was exploring how to use an embedded Tomcat while enabling an AJP connector. I wanted all the configuration to be property-driven: it should allow the HTTP and AJP ports to be specified, and allow AJP to be switched off when running the app locally.

Here’s how I went about it.

Application class

Firstly, in the Spring Boot Application class you can tell the application on startup to use custom settings for the embedded Tomcat. Out of the box, specifying server.port as a property changes the port of the standard HTTP connector. I specified that property along with some other values specific to AJP.

server.port=8082
tomcat.ajp.port=9090
tomcat.ajp.remoteauthentication=false
tomcat.ajp.enabled=true

These were then wired into the Application class using @Value annotations. server.port is already handled by Spring Boot, so no extra work is needed for it.

@Value("${tomcat.ajp.port}")
int ajpPort;

@Value("${tomcat.ajp.remoteauthentication}")
String remoteAuthentication;

@Value("${tomcat.ajp.enabled}")
boolean tomcatAjpEnabled;

Then I added in a specific Bean which defines the Tomcat settings, and whether or not to switch on AJP based on a property being set.

@Bean
public EmbeddedServletContainerFactory servletContainer() {

    TomcatEmbeddedServletContainerFactory tomcat = new TomcatEmbeddedServletContainerFactory();
    if (tomcatAjpEnabled)
    {
        // Add an AJP/1.3 connector alongside the default HTTP connector
        Connector ajpConnector = new Connector("AJP/1.3");
        ajpConnector.setProtocol("AJP/1.3");
        ajpConnector.setPort(ajpPort);
        // Plain AJP behind Apache/mod_jk: no TLS on this connector, no TRACE
        ajpConnector.setSecure(false);
        ajpConnector.setAllowTrace(false);
        ajpConnector.setScheme("http");
        tomcat.addAdditionalTomcatConnectors(ajpConnector);
    }

    return tomcat;
}

Then when I start up the application, I end up with an HTTP connector running on a specific port, and also optionally an AJP connector running on a specific port.

2015-06-24 08:40:09.514 INFO 93685 --- [ main] org.apache.coyote.ajp.AjpNioProtocol : Initializing ProtocolHandler ["ajp-nio-9090"]
2015-06-24 08:40:09.516 INFO 93685 --- [ main] org.apache.coyote.ajp.AjpNioProtocol : Starting ProtocolHandler ["ajp-nio-9090"]
2015-06-24 08:40:09.521 INFO 93685 --- [ main] s.b.c.e.t.TomcatEmbeddedServletContainer : Tomcat started on port(s): 8082 (http) 9090 (http)
2015-06-24 08:40:09.523 INFO 93685 --- [ main] uk.ac.ed.ca.centralauthms.Application : Started Application in 4.178 seconds (JVM running for 4.6)
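For completeness on the Apache side, mod_jk only needs a worker pointing at the AJP port the application exposes. The sketch below shows the shape of that configuration; the worker name, host and mount path are assumptions for illustration, not our actual setup:

```properties
# workers.properties: hypothetical mod_jk worker for the embedded Tomcat AJP connector
worker.list=springbootapp
worker.springbootapp.type=ajp13
worker.springbootapp.host=localhost
worker.springbootapp.port=9090

# In the Apache virtual host, requests would then be mounted with something like:
#   JkMount /app/* springbootapp
```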


Scotland JS 2015 – Day 1


Recently I attended the Scotland JS conference, which I have to say was really inspirational. A big thank you to the organisers for making this happen! The conference was an amazing mixture of the practical and thought-provoking. I came away with a lot of ideas on improving my own work.

For those who weren’t able to attend I’ll be posting my notes from the talks. You can also find out more about these from the conference website (linked above) and the Scotland JS Twitter feed.


Documenting Spring Boot Microservices with Swagger

It’s a fairly well-known fact that many developers don’t like to write documentation, often muttering things like “the code is the documentation” in a half-hearted manner that suggests even they don’t believe it. Recently I was looking to write a microservice, so I also wanted to look at ways we could produce nice, easy-to-use documentation in a consistent manner.

Swagger

This led me to look at Swagger. It’s a way to produce elegant and powerful interactive documentation on your REST API without having to write pages of documentation. And when used with annotations and Spring Boot, truly the code *is* the documentation.

Spring Boot Example

I’m completely taken with Spring Boot already. It takes a lot of the complexity out of getting an initial application up and running, and allows you to add in features easily. In my application, I produced a REST API for getting identities out of LDAP, supporting both a lookup of the logged-in person’s identity and lookups of other identities.

The POM file

We still use Maven (although it’s getting more and more tempting to switch to Gradle). My pom file contains the following dependencies:

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>    
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-actuator</artifactId>
    </dependency> 
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.jayway.jsonpath</groupId>
        <artifactId>json-path</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.springframework.ldap</groupId>
        <artifactId>spring-ldap</artifactId>
        <version>1.2.1</version>
        <type>jar</type>
    </dependency>
    <dependency>
        <groupId>com.mangofactory</groupId>
        <artifactId>swagger-springmvc</artifactId>
        <version>1.0.2</version>
        <type>jar</type>
    </dependency>
</dependencies>

Most of the dependencies are covering the Spring Boot side of things, REST support, web application and LDAP/JSON support. Note though the bottom dependency, which is including support for Swagger Spring MVC.

The Controller

The controller is fairly simple; it maps two URLs. The key thing of note is the @ApiOperation annotation, where we describe what the method does.

@RestController
public class UserLookupController {
    
    @Autowired
    LdapService ldapService;
    
    @ApiOperation(value="Get the currently logged in user's details",notes="Uses the remote user logged in")
    @RequestMapping(value="/my",method=RequestMethod.GET)
    public @ResponseBody Person getMyDetails(HttpServletRequest request) throws ServletException
    {
        if (request.getRemoteUser()==null)
        {
            throw new ServletException("Remote user is null.");
        }
        return ldapService.getPerson(request.getRemoteUser());
    }

    @ApiOperation(value="Get a specific user's details",notes="Requires uid of user to look up")
    @RequestMapping(value="/id/{uid}",method=RequestMethod.GET)
    public @ResponseBody Person getUserDetails(@PathVariable("uid") String uid)
    {
        return ldapService.getPerson(uid);
    }
    
}
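The Person class returned by these endpoints isn’t shown above. A minimal sketch of what it might look like is below; the field names (uid, displayName, mail) are assumptions for illustration, and the real class would map whatever LDAP attributes the service exposes. Swagger derives its model documentation from beans like this automatically.

```java
// Hypothetical DTO returned by the lookup endpoints. A plain bean with
// getters/setters is all Swagger and Spring's JSON serialisation need.
public class Person {

    private String uid;
    private String displayName;
    private String mail;

    public String getUid() { return uid; }
    public void setUid(String uid) { this.uid = uid; }

    public String getDisplayName() { return displayName; }
    public void setDisplayName(String displayName) { this.displayName = displayName; }

    public String getMail() { return mail; }
    public void setMail(String mail) { this.mail = mail; }
}
```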

Swagger Config

We also need a config class to tell Swagger what to do, and to provide high-level documentation about what the service provides.

@Configuration
@EnableSwagger
@EnableAutoConfiguration
public class SwaggerConfig {
    
    private SpringSwaggerConfig springSwaggerConfig;
 
    @Autowired
    public void setSpringSwaggerConfig(SpringSwaggerConfig springSwaggerConfig) {
        this.springSwaggerConfig = springSwaggerConfig;
    }
    
    @Bean
    public SwaggerSpringMvcPlugin customImplementation() {
        return new SwaggerSpringMvcPlugin(this.springSwaggerConfig)
                //Root level documentation
                .apiInfo(new ApiInfo(
                        "Central Authorisation Service JSON API",
                        "This service provides a JSON representation of the LDAP identity data held in the Central Authorisation Service",
                        null,
                        null,
                        null,
                        null
                ))
                .useDefaultResponseMessages(false)
                //Map the specific URL patterns into Swagger
                .includePatterns("/id/.*","/my");
    }
    
}

The Application Class

The Application class is simple, and just hooks into our LDAP config:

@Configuration
@ComponentScan("uk.ac.ed.ca")
@EnableAutoConfiguration
public class Application {
    
    public static void main(String[] args)
    {
        SpringApplication.run(Application.class, args);
    }
    
    @Bean
    @ConfigurationProperties(prefix="ldap.contextSource")
    public LdapContextSource contextSource() {
        LdapContextSource contextSource = new LdapContextSource();
        return contextSource;
    }

    @Bean
    public LdapTemplate ldapTemplate(ContextSource contextSource) {
        return new LdapTemplate(contextSource);
    }
    
}


HTML UI

Finally, we add a static set of HTML/CSS/JS files to present the API documentation. You can get the static pages from https://github.com/swagger-api/swagger-ui. Put them in your project under (src/main/)resources/static and they’ll automatically be mapped into the application.

The end result

Swagger UI screenshot

The end result is a service which also provides the interactive documentation shown above. Because the documentation is interactive, you can try calling the id service and see what kind of response it gives. Very neat, very powerful, and a very easy way for us to provide API documentation!

ETag and JSON data

Recently, as part of an update project for our university portal MyEd (which runs on uPortal), there was an emphasis on moving our content to more client-driven access to data. We wanted to separate out the data and presentation a bit more, and also cut down on the load and traffic which a big single server-side render would produce.

We wanted to use JSON as the data format as it is nice and lightweight, and easy to parse with existing JavaScript libraries (like jQuery). We then wrote static URLs into the uPortal portlets which allow the currently authenticated user (and them alone) to access their own data.

Our portal is under a reasonably heavy concurrent load at any given time, so we wanted to explore caching of data to make sure any client-side calls perform well under load.

Cache Headers versus ETag

Cache Headers are used to tell a browser to not re-request an object from the server until a certain time, typically by setting an expiry date. This avoids any traffic going to the server at all, which reduces load but can mean that changes to data are missed because the cache expiry date has not been reached.

ETagging is different, in that an ETag value is set in the header, for example:

ETag: "asb227873hva23456n"

When the browser re-requests data from the URL it passes the ETag back to the server in an If-None-Match header, e.g.:

If-None-Match: "asb227873hva23456n"

The server then uses the ETag to decide what to do: either send an HTTP status code of 304 Not Modified (typically with a very short response), or refresh the data and return new information back to the client. This reduces the bandwidth required, but more importantly allows the server to decide how and when to respond with fresh data.

In order to get the best performance, you would in most situations use both caching and ETags, limiting high-frequency client traffic to the server while still allowing the server to mitigate load using the ETag. We found that using both in our uPortal server alongside our load balancer led to unexpected results, so we opted to initially use ETagging only.

(As to why our load balancer was causing unexpected caching behaviour we’ll have to investigate later, and potentially write up another post in and of itself!)

Portlet modifications

So in the portlet itself (which is written in Java), we modified the JSON data controller method to add an ETag:

final String eTag = getETag(data);
final Date expiry = new Date(System.currentTimeMillis() + MAX_AGE_MILLIS);
        
session.setAttribute(SESSION_ATTR_CACHE_ETAG, eTag);
session.setAttribute(SESSION_ATTR_CACHE_EXPIRY, expiry);
response.setStatus(HttpServletResponse.SC_OK);
response.setHeader("Cache-Control", "must-revalidate");
response.setHeader("ETag", eTag);
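The getETag(data) helper isn’t shown in the original; one reasonable sketch (an assumption on our part, not the actual portlet code) is to hash the serialised JSON so the tag changes exactly when the data does. MD5 is fine here because this is cache validation, not security:

```java
import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical implementation of getETag(): digest the serialised JSON so
// the tag changes whenever the underlying data changes.
public class ETags {

    public static String getETag(String json) {
        try {
            MessageDigest md = MessageDigest.getInstance("MD5");
            byte[] digest = md.digest(json.getBytes(StandardCharsets.UTF_8));
            // Zero-padded hex, wrapped in quotes as the ETag header syntax requires
            return "\"" + String.format("%032x", new BigInteger(1, digest)) + "\"";
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("MD5 not available", e);
        }
    }
}
```

The same input always produces the same tag, so a client re-sending it in If-None-Match will match until the data actually changes.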

Finally, we then added a check in the method for the ETag coming from the If-None-Match header:

final String ifNoneMatch = request.getHeader("If-None-Match");
final String existingETag = (String) session.getAttribute(SESSION_ATTR_CACHE_ETAG);
final Date existingExpiry = (Date) session.getAttribute(SESSION_ATTR_CACHE_EXPIRY);
if (null != ifNoneMatch
        && null != existingETag
        && null != existingExpiry
        && ifNoneMatch.equals(existingETag)
        && System.currentTimeMillis() < existingExpiry.getTime())
{
    response.setStatus(HttpServletResponse.SC_NOT_MODIFIED);
    return null;
}

The above code takes the ETag passed in by the client, compares it with the one stored in the user session, and additionally checks a server-side expiry date, responding with 304 Not Modified if the tags match and the expiry hasn’t passed. Returning null means we never have to query the underlying dataset to respond, so both the response time and the bandwidth used are dramatically reduced.

jQuery UK – Morning Sessions

Last week I was lucky enough to attend jQuery UK, a conference focussed on front-end web development and the technology and tools behind it. Despite the title, jQuery UK isn’t focussed exclusively on jQuery. This year’s keynote speech from Mark Otto was specifically about CSS, and a couple of talks discussed practices which avoid even using JavaScript.

The conference mostly took place across two streams, which means I can only report on around half of the content based on the sessions I attended. I specifically tried to attend talks that could be relevant to the work we do in Apps, and so missed out on topics like game development and WebGL. When the videos for the remaining talks are uploaded I’ll scan through them as well and write a follow-up post.

I’ve split my talk descriptions into two posts: morning and afternoon. I should also note that I’m providing this write-up from the point of view of IS Apps, so some stories might not be as relevant to you as others. I’ve provided a tl;dr with each talk which will hopefully help suggest which you might want to read more details on.

Edit (24th March): Updated each talk with its video and slides where possible.


IT Futures – Morals, ethics, surveillance, security

The IT Futures conference raised a number of issues around data: who should be responsible for its safety, and what can and/or should be collected. While most of the conference covered interesting pieces of research and investigation, a few parts were of particular relevance to SSP.

The ethics of detail analysis

There are a number of measures in place for making use of student data, for example identifying when a student is experiencing difficulties before an assessment: an early warning system of sorts. However, some data that could be gathered for this purpose is deliberately not collected; IP addresses giving the location of a login, and the cause of an absence, are two such examples.

The data that SSP uses revolves mainly around student data, so much of what was discussed focused on how it can and should be used. Because access to that data is widespread within the team, decisions have to be made about what will and will not be used.

Whose responsibility is it that your data is safe?

It was said at the conference that the University has experienced 12 moderate-to-severe data security incidents. The key-logger found back in November was one such incident, though it was easily broken into because whoever installed it forgot to change the default password. Universities do not like admitting vulnerabilities, but three other institutions, including Birmingham, have also found key-loggers, so this is not an isolated incident among universities.

The location of data becomes very important for security reasons. The University has an agreement with Microsoft to use OneDrive for storing data ‘safely’. This then puts the responsibility of securely storing that data on Microsoft.

Wherever data is stored, either locally or off-site with a contracted third party, it pays to think about how it is secured.

On a side note, it was mentioned that Office 365 has an ability to remotely wipe devices. This can lead to unfortunate situations where a device could be wiped remotely when it shouldn’t be!


From the sublime to the ridiculous: Development tools in Dev Services

As a ColdFusion developer, my own journey to finding the perfect development environment has been, I suspect, fairly typical. I cut my teeth with Dreamweaver. I progressed to ColdFusion Builder 1, then 2. I toyed with Notepad++. I gave Eclipse a whirl. Most recently, I’ve been developing almost exclusively in Sublime Text 2.

I feel that my switch to Sublime Text has increased my productivity, so naturally I was curious about what others in the team were using, and whether it would be worthwhile purchasing licenses for the team(s). To find out more, I asked colleagues within Development Services to participate in a survey about their development environment preferences.

21 people were kind enough to take the time to respond; here’s what I found:

Q1. What languages do you work with?

Question 1: What languages do you work with?

With skills across a range of software platforms, we’re not a one-tech-shop, so to get some context I had to ask respondents about what languages they were developing in.

It was interesting to note that we have more developers using Java than ColdFusion, despite ColdFusion being our primary development platform. In hindsight, perhaps I should have framed the question to include a weighting on time spent on each language.

There were no surprises that SQL is ubiquitous and JavaScript usage is widespread.

The 3 “Other” responses were: Bash, Unix shell scripting and C#.

Q2. How do you run your local development?

Question 2: How do you run your local development?

I thought that whilst I had people completing the survey, I might as well try to find out some additional things about how they worked, like their approach to local development.

The ‘Other’ response was:

for SITS we have to develop on the client but use a lot of local development sometimes with VMs etc

I was surprised by the number of respondents who use locally installed server software; personally, I have found the use of virtual machines to have huge advantages in simulating infrastructure and assisting collaboration within projects.

Of the two main virtualisation options, VMware Player has the edge over VirtualBox.

Q3. What do you currently use as your primary development tool?

Question 3: What do you currently use as your primary development tool?

This is the question I was really interested in: What IDEs or editors are being used for development?

Unfortunately I only allowed respondents to choose one answer. Some obviously couldn’t decide and so chose ‘Something else’ and put multiple tools:

  • Spring Tool Suite/Eclipse/Webstorm
  • I switch pretty evenly between eclipse & netbeans
  • PSPad
  • SQL Developer
  • Oracle Developer but also use Notepad++
  • SublimeText (evaluation period), ColdFusion Builder 2, NetBeans
  • PHP Storm and Notepad+

It looks like Sublime Text has the edge, with Eclipse and NetBeans coming in close second.

Q4. How do you feel about the IDE/editor you chose?

Question 4: How do you feel about the IDE/editor you chose?

I asked this question because I wanted to know if people are satisfied with what they’re using.

The results show that people seem mostly satisfied. Some respondents gave detailed feedback:

I’m pretty happy with both eclipse & netbeans, although both have their niggles. I don’t think there’s such a thing as the perfect IDE. At 7 below, I say I’d consider switching to SublimeText, but I’d need to evaluate it as I have never used it. I’m not sure how well it would work for Java development, and there is an eclipse plugin for Drupal development I use which I don’t think would be there for SublimeText.  [Answered ‘eclipse and netbeans’ in Q3]

It has lots of niggles and there is not a better alternative available in the University, there are better ones available though. [Answered ‘Oracle Developer but also use Notepad++’ in Q3]

CFBuilder does the IDE job, LOVE SublimeText does it all (or most of it – CF), Love NetBeans (Java), Like Eclipse (Java) [Answered ‘SublimeText (evaluation period), ColdFusion Builder 2, NetBeans‘ in Q3]

I don’t really like it, but haven’t made the effort to sort out a different one… [Answered ‘Eclipse‘ in Q3]

Q5. What features do you use?

Question 5: If you use an IDE, what features do you use?

I wanted to know more about what features our developers were looking for in their choice of IDE/editor.

The results show that only around half of the respondents use the features that IDEs provide; the other half are either using their IDE purely as an editor, or not using an IDE at all.

I wondered why some people are finding value in the IDE features, and some are not. Perhaps there is a link with development platform? I correlated the results with the development platform data from Q1:

Correlation between development platform and usage of IDE features.

This shows that:

  • All but two of the Java developers use multiple IDE features.
  • No ColdFusion or PHP web application developers use IDE features unless they also develop in Java.
  • The developers working on 3rd party applications do not use IDE features, except for one respondent who uses line debugging.

My own experience of using these IDE features within a ColdFusion / ColdFusion Builder context has been largely frustrating. Line debugging in particular would be a useful troubleshooting technique, but configuring it to work with my local development environment (i.e. ColdFusion running in VMs) is enormously difficult and can turn into a huge timesink. This is an area where collaboration may be useful to find some kind of solution that is workable.

Q6. Would you like a Sublime Text license?

Question 6: Would you consider switching to Sublime Text if a license was purchased?

Sublime Text is not a free product. Would more developers use it if the University provided licenses?

The results suggest that three quarters of those surveyed would be interested in getting a license for this product.

Summary

People are using a wide variety of tools to support development.

Most ColdFusion developers only require editor features, whereas most Java developers use multiple IDE features.

Many developers would like to use Sublime Text if we had a license for it.

I was not surprised at the positive response to Sublime Text. Personally, I have found that it offers many features, such as multi-line editing, which are a huge boost to productivity.

Thanks to all who participated in the survey, I feel that the results demonstrate that there is a strong case for offering Sublime Text licenses for those who would find it useful.