Synergistic design solutions using a dry-wipe medium

This blog post is not brought to you by Simply Glass Wipeboards, nor do I receive any form of commission on purchases or compensation for writing it.

So a while back I got tired of throwing away pad after pad of paper from drawing mockups and making notes. Iain swung by my desk recently and saw my solution: a desktop glass whiteboard, which he thought might be worth sharing with everyone.

Pictured below is an early design of this blog post I drew up on it. Honest.


It’s been great for to-do lists: I can’t erase anything until it’s done, because once wiped it’s gone forever. It’s good for rough designs, as you can quickly erase and try again – though granted not for a whole application (it’s only A3 sized). It’s bad for handing over designs to colleagues, as they try to steal it from you.

If of interest, this is it!

P.S. Keep all Sharpie pens well away from your desk if you do go for this.

Software Development Community of Practice

Imagine having access to a safe environment where you can ask all sorts of subject-related questions. Or perhaps you prefer meeting people and talking about your experiences and theirs? What about situations where you really need some advice, or a second opinion on a method or on how to apply a standard?

A community of practice does all of this and lots more


As well as delivering tangible results

Communities of practice are a great way of getting people to connect and talk about a common area of interest. Working here at Edinburgh University, I have seen fantastic work going on in my own department and Division, but also in the Schools and of course in other parts of Information Services.

From working with UCISA as Vice Chair of the Infrastructure Group I have gained connections and contacts who have been great sources of information, ideas and new ways of thinking, and these have really translated into tangible results. This is exactly what a community of practice should be about.

Universities have a long history of inter-organisational collaboration, and for an organisation with the scale and diversity of our own, we have a fantastic opportunity to make the most of our rich sets of skills, experiences and specialisms.

In my current role I can see a fantastic opportunity to create a Community of Practice in the University that focuses on Software Development and all that it involves. This interest area is a big thing for a lot of my direct colleagues and I firmly believe that a community of practice would be a great vehicle for encouraging collaboration on so many levels. So that’s what we are going to do!

So look out for activity from the new Software Development Community of Practice.

Why not sign up to the mailing list at

Or drop me a note at

Reflections on UCISA 16 conference

“Sue, that is surely the best conference I have ever attended”

This was what I said to Sue Fells, who is the Business and Operations Manager at UCISA. Rarely have I come away from a conference feeling that something happened while I was there that will make me think completely differently. A conference where each presentation leaves you asking questions is very rare indeed, but that is exactly what happened.

When I was thinking about how I was going to write this blog, I wondered which pieces would be most interesting to staff in Apps Division, what I could take back that would encourage people to build on what we were doing, and what could really motivate people to develop the initiatives we already have underway.

So my planning for this started before I went: which sessions would I go to? Which vendors would I speak to? How could I use the time with the Director and other colleagues to really make the most of the event? Lots of plans, lots of ideas – great, I’m ready to go!

What I didn’t expect was that, as it turned out, a lot of my preparation would be replaced by my reaction to the experience!

I could go on at length about the sessions I attended – I have many notes, I can assure you. But in truth a picture paints a thousand words, and with that in mind I have linked two of the key presentations into this blog. I would encourage everyone who reads this to take a look at my highlight talks… really, it’s worth it.

First of All

California dreamin’ presented by Hilary Baker, Vice President for Information Technology and CIO at California State University.

Hilary’s presentation covered the approach she and her colleagues had taken to engage with their 41,548 students at CSU, encouraging them to participate in shaping their own experience at the University, with the ultimate objective of increasing student graduation rates.

Hilary asked students how technology could make things better for them as they managed their degree process, navigated their experience at the University and prepared for their future careers. What a great package of objectives!

They set up a competition called AppJam, in which students were invited to form teams drawing on skills from a range of specialist areas and collaborate on ideas, mock-ups and prototypes for apps that could be incorporated into the institutional app, CSUN Mobile. The teams were asked to present their ideas and prototypes, and the winning entry actually became a real part of the app. I guess they got 24 teams partly because there was a rather nice cash reward into the bargain, but again there was lots of great real-world experience to be had.

Have a look at the presentation to learn how this all went and why they are going to grow the idea!

Really inspirational ideas and commitment from Hilary and her team.

A great phrase Hilary used really underlined the success of their project: “Students Can No Longer Escape Learning”.


Creative Leadership by Jamie Anderson, Professor of Strategic Management at Antwerp Management School and Visiting Professor at London Business School.

Now this is what you call a life changer. I defy anyone to see this presentation and not be blown away by it! Really, I think it is one of the most insightful presentations I have ever seen!

Jamie takes the audience on a journey of self-discovery, something that will leave you really thinking about many diverse things but specifically what we all need to do in order to be creative in our work.

I will not spoil the fantastic experience of following him on the journey and encourage you to take about 45 minutes to treat yourself!

If there was ever a presentation that would make you sit up and listen, then this is it.

Please do take a look at this!

All of the presentations can be found here, and registration is a simple process of adding your email address.


I’m attending the UCISA CISG 2014 conference.  As this is not specifically about development, I’m using my own blog to post about the presentations (with some delay, as I’m not one of these people who can write a blog post during the presentation itself!).

UCISA is the University and Colleges Information Systems Association, which brings together IT people from across the UK Higher Education sector. CISG is their Corporate Information Systems Group, who run a conference every year.

SSP New Developments

The Student Systems Partnership (SSP) Team has recently developed and successfully delivered to live the following three projects:

SAC018 UKBA Data Recording – Overseas students can study in the UK after being granted a visa under Tier 4 of the Home Office (formerly the UK Border Agency) points-based system. The main aim of this project was to improve the way the University holds data for Tier 4 students by bringing information from disparate systems into EUCLID, allowing effective reporting on, and monitoring of, Tier 4 students.

Other improvements were made so that CAS (Confirmation of Acceptance for Studies) requests for students extending their studies are generated in EUCLID rather than keyed manually. Tier 4 data is now presented in one place within EUCLID for review by Registry staff at census points. Copies of passport and visa documents can be stored in and accessed from EUCLID.

The software was successfully used during the last week (20th–24th October) for the Tier 4 Census of the 2014-2015 academic session. The census details of around 5,500 students were collected and stored within the SITS database.

SAC019 Direct Admissions Review – The purpose of this project was to review Direct Admissions processes across the University, with a particular focus on PG and VS applicants, looking at the process from the decision to submit an application through to the point of decision. The new Direct Admissions application uses the same framework as the one developed as part of the first SSP Agile project, Paperless Admissions. On the technical side, the following technologies and improvements have been added to SITS:

PD4ML – a powerful PDF-generating tool that uses HTML and CSS (Cascading Style Sheets) as its page layout and content definition format. The software has been installed on the SITS server and is used to generate the PDF version of the offer letter so that it can be printed at the student’s request. The technology has also been used successfully in the Student Self Service to print documents such as the Certificate of Matriculation, the Higher Education Achievement Report (HEAR) and the Certificate of Student Status (for Council Tax exemption).

E:Vision SSO (Single Sign On) enhancement – the MD5-based method of securing single sign-on links has been replaced with the more secure AES (Advanced Encryption Standard), using a 32-character key.

SAC033 Tier 4 Engagement Monitoring – The aim of this project was to meet the UKVI requirement to be able to report on engagement for all Tier 4 students by September 2014. As part of this project the following functionality has been delivered:

  • Exposure of engagement points from other sources within EUCLID
  • Bulk creation of engagement points within EUCLID
  • Auto-scheduling of engagement points per student within EUCLID
  • Facility for administrators / academics to record engagements within EUCLID
  • Upload of engagement points from spreadsheets into EUCLID from an external source

From the technological point of view the following solutions have been used:

  • Creation of the JSON object from the .csv file
  • Validation of the JSON objects using the JSON schema
  • Creation of csv files from the html tables
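
To illustrate the first two points above, here is a minimal sketch of converting spreadsheet rows to JSON and then sanity-checking them. The column names and sample data are purely illustrative (they are not the real EUCLID schema), and the `validate` function is a simple stand-in for proper JSON Schema validation.

```python
import csv
import io
import json

# Hypothetical engagement-point rows as they might arrive in a
# spreadsheet export; column names are illustrative only.
CSV_DATA = """student_id,engagement_date,engagement_type
s1234567,2014-09-15,lecture
s7654321,2014-09-16,tutorial
"""

REQUIRED_FIELDS = {"student_id", "engagement_date", "engagement_type"}

def csv_to_json(csv_text):
    """Convert CSV rows into a JSON array of objects."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

def validate(records):
    """Minimal stand-in for JSON Schema validation: every record must
    contain all required fields, and none may be empty."""
    return all(
        REQUIRED_FIELDS <= record.keys()
        and all(record[field] for field in REQUIRED_FIELDS)
        for record in records
    )

payload = csv_to_json(CSV_DATA)
records = json.loads(payload)
```

In practice a proper schema library would replace `validate`, but the shape of the pipeline – parse CSV, emit JSON, validate before loading into EUCLID – is the same.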

IS at DrupalCon Amsterdam – Day 3

This is the third in a short series of posts from DrupalCon 2014 in Amsterdam, where a few members of IS are spending this week. On Wednesday and Thursday I posted some session summaries from Day 1 and Day 2 of DrupalCon.

Yesterday was the final day of conference sessions and after the main auditorium session of Drupal Lightning Talks, the Drupal Coder vs Themer Smackdown perfectly illustrated one of the best things about DrupalCon, the element of fun that can pervade even the driest, most technical discussion.  The Smackdown was neither dry nor immensely technical, but Campbell Veretsi and Adam Juran managed to make some serious points about good Drupal development practices whilst wearing martial arts gear and waving weaponry around.  Watching their antics was a great way to wake up for a day of DrupalCon talks, and their battle to create a Drupal site from wireframes using either only code or only the theme in only 15 minutes showed how Coders and Themers are inherently dependent on each other and are better off hugging than fighting.

Our experiences of some of Thursday’s sessions are outlined below. Two sessions from Day 2 which were not written up in time to appear in yesterday’s post are included here. Once again, thanks to Aileen, Arthur, Riky, Tim, Andrew, Adrian and Stratos for contributing their thoughts on sessions they attended; any errors, omissions or misinterpretations in their edited summaries are entirely mine. Most of the sessions mentioned below, along with many more interesting talks, are recorded and available on the DrupalCon website, so if the summaries pique your interest, visit the DrupalCon site for more information!

Development Processes, Deployment and Infrastructure

How we Quantified the True Business Value of DevOps with Real-life Analysis

This talk did an excellent job of not only giving the general benefits of DevOps, but why it is good for the Business too. It focused on six phases for implementing DevOps, saying that it’s not about whether you are using DevOps or not, but more a case of how much.

  • Create your world – Use deployment and configuration management, and standardise across the board.
  • Monitor your world – Use effective automated monitoring with easy access to information and a clear notification and reaction process. This is not using grep in /var/log/; see the session on logging and monitoring tools.
  • Improve your world – Minimise repetition so that you can maximise the time spent on actual issues rather than environmental problems. Nobody should be logging onto servers.
  • Test your world – Use automated testing (you shouldn’t need to depend on a developer triggering them) with robust test strategies; this makes customers happy. We can start small, as any test is better than none.
  • Scale your world – Have automated responses to increased needs, with predictability, reliability and graceful degradation.
  • Secure your world – Use proactive and reactive strategies with intrusion detection and alerts.

We would need to build institutional confidence in our process and what we’re doing, but we can start small. This is firstly a culture change and then a process change, but without Business buy-in, the task is complicated and often doomed to failure. The biggest initial wins can be found with configuration management (Puppet), automated deployment (Bamboo, which we’re already using) and easy scaling (OpenStack, or perhaps AWS). By using a quantification framework we can evaluate the benefits in using DevOps processes, though they are not all immediately quantifiable; it’s best to start with easy, universally understood metrics.

Of all the sessions I have seen, this is a true must-watch for anyone who doubts that DevOps is the future; it contained so much useful information that my summary has barely scratched the surface. Watch it during your lunch; you can thank me afterwards.

GitHub Pull Request Builder for Drupal

Note that this session happened on DrupalCon Day 2.

This session described how Lullabot use GitHub pull requests to automatically build a Drupal instance to test changes.

The pull request includes the Jira ticket number and allows you to review a list of all commits, much as Bamboo currently does for us, but you also get a diff of the changes across the whole request.

For their automated deployment Lullabot use Jenkins, which listens for pull requests to the development branch rather than for commits. It then builds a dev environment from the dev branch, including the pull request patch, and when done posts a comment in the pull request with a link to the testing environment. The comment includes instructions for clearing down the test area when finished. This is achieved using a Jenkins plugin listening to an IRC channel; a message of “jdel 12345” will delete the test environment for pull request 12345.
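
The chat-driven teardown described above boils down to a tiny command protocol. As a hypothetical sketch (the real implementation is a Jenkins plugin, not this code), the bot only needs to map a message like “jdel 12345” to the pull request whose environment should be destroyed:

```python
import re

# Matches the teardown command described above: "jdel <PR number>".
JDEL = re.compile(r"^jdel\s+(\d+)$")

def parse_teardown(message):
    """Return the pull-request number to tear down, or None if the
    message is not a teardown command."""
    match = JDEL.match(message.strip())
    return int(match.group(1)) if match else None
```

Anything that parses to a number would then trigger the job that deletes that pull request’s test environment.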

When building the test environment, a recent copy of the live database is used so they can test against real data.

If there are any changes required and more commits are pushed to the pull request, Jenkins rebuilds the test environment and re-runs tests. Lullabot have found this very useful as it lets clients quickly see new features or enhancements without affecting other environments, especially where multiple features are being developed in parallel; each feature has its own test environment derived from that feature branch’s pull request.

Once the pull request is merged you can automatically deploy to another environment.  Alternatively, this can be left as a manual job and multiple pull requests included to build a release.

As part of this automated deployment process, Lullabot run automated tests using CasperJS and take screenshots with Resemble.js; the process then sends out a login link for testing using an admin user which exists purely within that test environment.

Lullabot are currently working on a service which supports this automated build of Drupal environments. Currently in private beta, it can be found at

Automated Performance Tracking

We should be considering how to measure performance better, not just in terms of which metrics to gather, but also by making sure that the measurements we take are repeatable and relevant. This talk was mostly about getting the “core introspection” methods more widely used and extended, since what is currently available is not very useful; that may not seem immediately relevant to the University, but there were some interesting points.

For instance, performance measurement should be part of the project from the beginning. We need to see how performance changes over time – ideally over every commit. This would allow us to evaluate changes in terms of performance – “Yes, sure you can have that feature, but it will make your site run 10% slower”.

There are many different technical challenges with measuring performance.

  • Which metrics to take? Different sets will be useful for front end, back end, databases, and external services.
  • Which toolset to use? XHProf and webprofiler are currently the most useful, and can be used to collect data automatically via XHProf-Kit.
  • How do we automatically set up relevant “scenarios”? This could actually be the easiest task for us. We could import data from LIVE to Staging and then use Behat to run tests for all the user stories. We could even run them in parallel for realistic load testing.
  • Data MUST be collected over time to allow decisions to be made. The smaller the granularity the better, in general.

There are many tools available to help with databases (one was mentioned in the session). These could be used as part of the regular support upkeep. The data collected can then be fed back into both the decision-making process and the development process.

We should also keep the slow query log and use tools like pt-query-digest to make sure that things are not getting worse! The sooner we find a problem, the better chance we have of figuring out what caused it and fixing it.

In order to keep the measurement relevant we need to make sure that the different environments are equivalent and that all infrastructure is identical; this is a common theme across many DrupalCon sessions this year.

Another problem with keeping the measurements relevant is ensuring that performance is NOT measured on sites running on virtual machines. The speaker discovered that the variation between runs was too great to make the measurements useful; to be comparable, measurements should be taken on dedicated machines, not virtual ones. This could create problems for keeping the infrastructure identical if we rely too heavily on methods that only work with VMs.

At least six stats need to be kept for each metric over many runs:

  1. Minimum value
  2. Maximum value
  3. Average
  4. Median
  5. 95th percentile
  6. 5th percentile

This is the only way to even out the many non-code contributors to performance.
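
The six statistics above are cheap to compute over a batch of runs. As a sketch (using the nearest-rank percentile convention, one of several reasonable choices), with illustrative response times:

```python
import math
import statistics

def summarise(samples):
    """Compute the six suggested summary statistics for one metric
    over many measurement runs."""
    ordered = sorted(samples)
    n = len(ordered)

    def percentile(p):
        # Nearest-rank method: the value at ceil(p/100 * n), 1-indexed.
        rank = max(1, math.ceil(p * n / 100))
        return ordered[rank - 1]

    return {
        "min": ordered[0],
        "max": ordered[-1],
        "mean": statistics.mean(ordered),
        "median": statistics.median(ordered),
        "p95": percentile(95),
        "p5": percentile(5),
    }

# Response times in ms from 20 hypothetical runs of one request; note
# how the two slow outliers distort the mean but not the median.
runs = [120, 125, 118, 130, 122, 119, 640, 121, 124, 117,
        123, 126, 120, 118, 122, 125, 119, 121, 620, 124]
stats = summarise(runs)
```

Tracking these per commit makes a regression stand out in the 95th percentile and maximum long before it shifts the median.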

The new SensioLabs profiler was also mentioned. It is currently in private beta but promises to be fully featured; we’ll probably need to wait and see. It will be free for OSS projects, so it will be easy to evaluate.

Building Modern Web Applications with Ember.js and Headless Drupal

Ember.js is a client-side JavaScript framework for building single-page applications using the MVC architectural pattern.  The presence of this session and similar sessions at this year’s DrupalCon reflects the fact that single-page applications are becoming the norm. For speaker Mikkel Høgh, this development is inevitable as the expectations of web users increase.  Constant page reloads are not efficient; it’s not just the request/response overhead that is an issue, but the repeated re-rendering of page content, CSS, etc. Ajax calls can help with this, but building an entire application using JavaScript, jQuery and Ajax without a framework does not make for clean, maintainable code.  Ember.js, like Angular and Backbone, is a framework designed to address these issues, with a rich object model and automatically updating templates using Handlebars, a semantic templating tool similar to Twig.

This session outlined the main core of Ember.js, a full-stack MVC framework in the browser, and demonstrated some key features such as:

  • adherence to the concept of “convention over configuration”, which means there is less boilerplate code and more auto-loading;
  • “ember-flavoured” Web Components, an intermediary measure designed to alleviate poor browser support for the Web Components standard, which is not yet complete;
  • the class-like Object Model, based on Ember.Object, which supports inheritance;
  • two-way bindings that allow templates to automatically update with data regardless of where the model is updated;
  • automatically updating ‘computed properties’;
  • the importance of Getters and Setters, which must be used to allow the appropriate events to fire and update all uses of the data;
  • Routing, which determines the structure of the web application by specifying the handlers for each URL;
  • naming conventions, the use of which allows the framework to make reasonable assumptions about what an application needs so that it is not necessary to define absolutely everything;
  • the Controller, Model and View in Ember.js;
  • the ability to rollback data changes in the model that are not saved, allowing for less messy handling of persistent state in the browser;
  • the ability to omit an explicit View implementation because Ember.js can make assumptions based on other application configuration to send a default view;
  • Ember-Data, the data-storage abstraction layer designed to simplify data management over a REST API using JSON;
  • useful tools for working with Ember.js such as EMBER-CLI.

The primary focus of the session was Ember.js itself, but the session did turn to the question of why to use Drupal as a back-end for an Ember.js application.  The benefits raised were very similar to those mentioned in other DrupalCon talks on headless Drupal, such as:

  • authentication, permissions and user management;
  • an easy Admin UI;
  • the availability of many modules to provide rich functionality, enabling the Ember.js application developer to focus on the core application.

It was really interesting to hear about an increasingly common approach to addressing the challenges faced by modern web developers. Single-page applications are not an area we have widely explored, but given their prevalence and the increasing richness of the JavaScript frameworks available, it’s important to have some awareness of this web development technique, and this session certainly provided much food for thought.  In the context of the University’s new central Drupal CMS, headless Drupal is not something we intend to explore; however, it seems likely that there will in future be local headless Drupal installations in Schools and Units that receive feeds from the central CMS.

If you’re interested in reading more about Ember.js, see these pages:

Front End Concerns

Integration of ElasticSearch in Drupal – the “New School” Search Engine

This session included a presentation and demo of ElasticSearch, a full-text search server and analytics engine based on Lucene, with a RESTful web interface and features also available through a JSON API.  Several Drupal modules were mentioned that have been written to make ElasticSearch available in a Drupal site.

Some key points:

  • easy to install and configure with an easy-to-use interface;
  • very scalable and distributed in a configurable way;
  • replication is handled automatically;
  • it is all open source and since the main application is comparable to a database the hosting needs will be similar;
  • the system contains a method which allows for conflict resolution if multiple users enter the same document to different nodes;
  • the query system is more powerful and flexible than other “URL only” systems for creating the queries;
  • it can be used with many other modules including watchdog and views;
  • it can be used with an ElasticSearch views module to allow querying of indexes of documents that are not in Drupal.
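
To give a flavour of the RESTful, JSON-based interface mentioned above, here is a hypothetical sketch of building a full-text query body. The index and field names are illustrative; in practice this JSON would be POSTed to `http://<host>:9200/<index>/_search`.

```python
import json

def build_search(field, text, size=10):
    """Build an ElasticSearch request body for a full-text 'match'
    query, returning at most `size` hits."""
    return {
        "query": {"match": {field: text}},
        "size": size,
    }

# Query an (assumed) "body" field for a phrase, then serialise the
# request body exactly as it would be sent over HTTP.
body = build_search("body", "community of practice")
request_json = json.dumps(body)
```

Because queries are plain JSON rather than URL fragments, they compose easily: filters, aggregations and sorting are just extra keys in the same document.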

Sites developed by WIT for other areas of the University currently use Solr where more powerful search features are required.  Following this session, they intend to try out a cloud-hosted ElasticSearch service with one of their sites that currently uses Solr.  This will allow comparison between ElasticSearch and Solr to determine whether it is a suitable alternative.  From the perspective of the University’s central website, it will certainly be interesting to explore the details further and understand how ElasticSearch could be useful. Watch this space!

Project Management and Best Practices

Drupal Lightning Talks

Thursday began with a series of short talks on various technical and non-technical topics.  Some, like the Coin Tools Lightning Talk, were technically of interest but not necessarily directly related to our own use of Drupal.

The Unforeseen: A Healthy Attitude To Risk on Web Projects

Steve Parks talked about how management of risk can be a major blocker to projects being successful, highlighting the need to accept that there is risk associated with any project, and that trust is of great importance in mitigating the impact of risk.


The talk on the Druphpet project and Puphpet showcased a Puppet-based Vagrant VM suitable for instant, unified configuration of a Drupal environment.  The question of how to get a fresh, consistent local development environment running as quickly as possible for the University’s new central Drupal CMS is something we are currently exploring.  Puphpet is certainly something we will look into!

Continuous Delivery as the Agile Successor

Michael Godeck’s talk was of particular interest given the adoption within IS of automated deployment tools to support our internal Agile methodology.  The subject is closely related to DrupalCon sessions on DevOps with common underlying principles such as the importance of communication across teams and shared ownership.

Godeck talked about how Agile was effective in changing software development because it has “just the right balance of abstraction and detail to take the software industry to a new plateau”. Improvements in quality & productivity are gained by using Agile tools seriously.  Agile was designed to address difficulties in responding appropriately to changing requirements throughout the project life-cycle.  It is successful in that regard, but the key is to be able to *deliver* the software.

Continuous Delivery practice has the goal of dealing with the delivery question in the way that Agile has dealt with management of risk.  The emphasis is on resolving the conflict between the need to deliver quickly and get fast feedback and the need to run complex test suites which can be slow.  Build Pipelines break the build up into stages, with the early stages allowing for quick feedback where it is most important, whilst the later build phases give time to probe and explore issues in more detail. Like Agile, Continuous Delivery only provides the best benefits by changing culture across both technical and non-technical teams.  The key point is that software delivery should not be a “technical silo”; it belongs to the whole team and with Continuous Delivery, the decision to deliver software becomes a business decision, not a technical one.

We are already using many of the techniques and building blocks that are part of Continuous Delivery. However, the principles of Continuous Delivery are worth exploring further to identify where we may streamline and improve our existing practices.

Lightning Talks 2

This session was a follow-up from the main auditorium Lightning Talks earlier in the day.  It comprised two separate short presentations.

Session 1: AbleOrganizer: Fundraising, Outreach and Activism in Drupal

In this talk Dr. Taher Ali (Assistant Professor of Computer Science and IT Director of Gulf University for Science & Technology (GUST)) presented on the challenges of convincing senior management to adopt Open Source applications. One of the major concerns was the support and maintenance of Open Source solutions. However, after presenting a convincing argument built around community strengths and licence costs, the University now runs the majority of its systems on open source applications.

One of the main advantages that the University has found is the ease of integration of Open Source application with one another.


Finally, it was noted that becoming a gold sponsor of this event was their way of feeding back into the community.

Session 2: eCommerce Usability – The small stuff that combined makes a big difference

Myles Davidson, CEO of iKOS, gave a rapid-fire presentation on how small, subtle changes can collectively make a huge difference to customers and their success.

Some examples are listed below.

  • When using forms – make things simple, don’t make your users think!
  • Know what your users want and develop the front end towards their needs.
  • Make it clear – don’t drive people away though ambiguous messages. Use help text to help not hinder.
  • Where possible use defaults – reduce double keying, e.g. delivery and invoice addresses.
  • Be careful with buttons – don’t break the user journey.
  • Search – do it properly, do it brilliantly or leave it alone. People will leave your site if search doesn’t work.
  • Site recommendations need to be realistic.
  • Analytics – the key is that you can’t manage what you don’t measure, and you can measure everything!

12 Best Practices from Wunderkraut

Note that this session happened on DrupalCon Day 2.

At last year’s DrupalCon I saw a presentation from Wunderkraut which featured 45, yes 45, different presenters in 60 minutes. This year they reduced that to a mere 12. Each presenter covered a single best practice compressed into 5 minutes, and not a second was wasted. There were actually only 11, but let’s not be pedantic.

  1. Risk – adopt a healthy attitude to risk. Trust, training and responsible planning are better than bureaucratic rules to manage risk.
  2. Predicting the future – Impact Mapping in four words Why, who, how and what. More info on
  3. Custom Webpage Layouts – put everything on one page!
  4. How to make complex things simple – your website should mirror your customers’ needs, not your company’s! Keep the content and the user experience consistent.
  5. Balance theory and practice – using new tools is not only about technologies it is also about approaches.
  6. Managing Expectations – 70% of projects fail due to communication. Keep communicating the minor decisions and use the project steering group to align expectations with stakeholders. Transparency is king!
  7. If you can’t install it, it’s broken – make sure the workflows work and keep the configuration in code, and remove old code. Old code smells.
  8. Alignment – let customers come to the community. The community is rich, vibrant and colourful; there’s no danger in encouraging your customer to become part of it.
  9. Learning an alien language in two years – structure the information and use technology, like Anki, which uses spaced repetition. Remember it is a step-by-step process that takes time – read, listen and talk to people.
  10. One size fits all – consider all the possibilities. Start with the smaller screens and prioritise the content. Content prioritisation requires good customer knowledge. After prioritisation the content can be re-engineered for the specific user journey. Lastly, this knowledge can be used to create a road map for content development.
  11. A different kind of bonus system – hugs equal money.12BestPractices2

Hardcore Drupal 8

Field API is Dead, Long Live Entity Field API!

With the beta release of Drupal 8 there are major changes to the API and Field API is no exception. This session outlined key aspects of Entity Field API in Drupal 8, some of which are summarised below.

The Entity Field API unifies the following APIs/features:

  • Field translation
  • Field access
  • Constraints/validation
  • REST
  • Widgets/formatters
  • In-place editing
  • EntityQuery
  • Field cache/Entity cache

Many field types are now included in core, removing the need to enable separate modules: email, link, phone, date and datetime and, best of all, entity reference. Having entity reference in core allows for some very neat chaining of entities:
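As a quick sketch (following the Drupal 8 beta API, using the node’s built-in author reference), a chained expression can walk from a node straight through to the referenced user’s name:

$author_name = $node->uid->entity->name->value;

Each ->entity hop loads the referenced entity, so references can be chained as deep as needed.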


And you can get a taxonomy term with:
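For example (field_tags here is a hypothetical term reference field on the node):

$term = $node->field_tags->entity;
$term_name = $term->name->value;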


All text fields now support in-place editing out of the box too, without the need for additional modules. Even in-place editing of the title is now possible.

Since fields can be attached to block entities in Drupal 8, fieldable blocks are now provided out of the box.

We also get “Form modes” in Drupal 8, which are similar to view modes: you can change the order and visibility of an entity type’s fields for multiple forms. In Drupal 7 you only have one add/edit form available, which leads to nasty workarounds, such as those required to provide separate user edit and user registration forms for the user entity. “Form modes” also make it much easier to have alternate create and edit forms and to hide fields in forms, especially using the Field Overview UI, which works along the same lines as the existing view modes UI.

Comment is now a field, which means you can have comments on any entity type.

In Drupal 8, everything is now an entity. There are two types of entity: configuration entities and content entities. Content entities are revisionable, translatable and fieldable. Configuration entities are stored to configuration management, cannot have fields attached and include things like node types, views, image styles and fields themselves. Yes, fields are entities!

Entities now have the full CRUD capability in core. They are classed objects making full use of interfaces and methods rather than having wrapper functions as in Drupal 7.

The following code example shows how nodes are now handled:
$node = Node::create(array(
  'type' => 'page',
  'title' => 'Example',
));
$node->save();
$id = $node->id();
$node = Node::load($id);

A newly created node has to be saved before it exists in the database.

Interfaces are now used to extend a base entity interface when creating custom entities:
$node implements EntityInterface
$node implements NodeInterface
NodeInterface extends EntityInterface

This means you have common methods across all entities:
if (!$entity->access('view')) {
  // ...
}

Having validation as a method in Drupal 8 separates it from form submission and also allows easier validation through REST APIs.
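As a sketch against the Drupal 8 beta API, validation is invoked directly on the entity and returns a list of constraint violations that can be inspected programmatically:

$violations = $node->validate();
if (count($violations) > 0) {
  // React to failed constraints without a form submission.
}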

You can have specialised methods for specific entity types:
$node = Node::load($id);
if (!$node->isPublished()) {
  // ...
}

There is built in translation support in Drupal 8, which allows the translated output of all fields on an entity to be handled much more easily than is currently possible:
$translation = $node->getTranslation('de');
$translation instanceof NodeInterface;
$translation->language()->id == 'de';
$entity = $translation->getUntranslated();

In Drupal 8, $node->body[LANGUAGE_NONE][0]['value']; becomes $node->body->value;. Much neater!

For multiple instances of a field, you can specify the delta with $node->body->get(0)->value or $node->body[0]->value.

There is a cheat sheet for the new Entity Field API, available at

All in all, these examples demonstrate how the changes to the Entity Field API in Drupal 8 will make for much cleaner, more readable, and more maintainable code.

An ERASMUS study visit

We have said goodbye to our visitor from the University of Trento, Mauro Ferrari.  Mauro is a web application developer and used the EU’s ERASMUS scheme to fund a two-week study visit to learn how we develop software and in particular how we implement the University’s portal, MyEd.

Mauro’s highlights were:

  • Learning about MyEd, particularly Martin Morrey’s EDUCAUSE talk on analytics
  • Talking about our project methods, especially agile projects
  • Seeing how we use development tools such as JIRA and Bamboo
  • Learning about the Drupal Features module for managing changes in Drupal modules

Mauro also complimented us on our attention to the experience of users and our commitment to migrate data across versions.  He was interested to learn about SAP BI Suite and could see how it would help his University but thought that this would be beyond his team’s current capabilities.

Mauro was more critical of some aspects of the user experience in MyEd.  One example he gave was the way that the whole page redraws when a user changes something in the Event Booking portlet.  He also thought the list of available portlets was hard to scroll.  He gave demos of the Trento portal to several of us; there may be lessons that we can learn from their work.

I was interested to learn of Trento’s approach to managing identities with multiple roles.  Each of their systems prompts you to choose your role when you log in, so you have a single identity and can select which role to use if more than one applies.  Their portal allows you to group all your portlets regardless of role.  This would be a big change for us and I am not suggesting that we change tack, just noting that it was interesting to see the different approach.

Mauro also demonstrated their system for creating and managing applications, which covers everything from Doctoral positions to summer school places to public lectures to internal events and more.  Basically it is a sophisticated form editor with a back-end that lets organisers check applications and so forth.  It clearly works for Trento; for us I think the question it raises is whether a central service of this sort would be useful.  Such a service would combine Events Booking, (use of) EventBrite for public events, OLLBook for evening courses, and possibly more.  I don’t see this as a priority but again it was interesting to compare the approaches.

My overall lesson from his visit is that we are a very effective and mature organisation with much to teach other universities.  Which is not to say that we know everything or that we cannot learn from other universities in return.

I would like to thank everyone who gave their time to talk to Mauro for helping to create a successful visit for our guest. I also thank Mauro for choosing us for his study; we were very pleased to be his hosts.

A new Course Timetable Browser

Last week we released a new Course Timetable Browser (CTB), which lets students and staff select a set of courses and see how they might timetable together. It is intended to complement the Web Timetables provided by Scientia, and to provide some of the functionality of the retired Timetab (a student favourite). You can access the Course Timetable Browser at the link below:


SITS Technical Working Group and STUTALK User Group Meetings

The SITS Technical Working Group and STUTALK User Group meetings take place every six months, each time hosted in a different location by a different university. The last meetings took place on the 22nd and 23rd of May at the University of Bristol. We had two representatives from the Student System Partnership Team (SSP): Tomasz Pogoda and Jon Martin.

The purpose of the Technical Working Group meeting was to discuss issues related to SITS software updates, new features in the latest software version 8.7.1, de-supported components and the Document Manager tool.

At the STUTALK User Group meeting we had a chance to discuss what institutions are using STUTALK for and how they have gone about implementing it. Tribal also presented new features and the road map.

A number of presentations were delivered across the two meetings. Tribal delivered three:

Email Configuration, Troubleshooting and Attachments – presented the new email features that will be available from version 8.7.1.
e:Vision, HTML5, Accessibility & Responsive Design – an overview of new features related to Tribal’s mobile strategy, responsive design and HTML5.
STUTALK 2.0 – presented the features of the new version of STUTALK.

There was one presentation delivered by King’s College London:

STUTALK and mobile timetabling.

We also had a chance to share our recent experience with Bootstrap, jQuery and CSS styling within our SITS configuration. The topic of our presentation was Improving the e:Vision User Interface using jQuery and Bootstrap.

We received very positive feedback, and most of the institutions were highly interested in our experience.

The presentation slides (BootstrapTWG_Edinburgh_University) and the minutes from both meetings (STUTALK User Group Meeting and Technical Working Group Meeting) are available.

X-ray Tomography Control

The School of Geosciences has an X-ray computed tomography (CT) scanner installed within one of its basement laboratories, which researchers within the school have assembled from individual components. The scanner consists of a number of independent elements, including an X-ray tube and corresponding camera, air table for holding and rotating samples and PCs for controlling the X-ray tube and handling image capture from the camera. Custom software to control the image capture process was written by a PhD student, and this software was driven by Testpoint (which also controls the table on which the sample sits).


The original implementation was based on a Windows 98 installation, and while that solution was rock solid, the time taken for the image capture process was significantly longer than ideal, at around 6 seconds per image (with a 0.5–1.5 second exposure time). Additionally, with the existing high-resolution camera displaying worsening artefacts from X-ray exposure, a more modern camera had been purchased and needed to be integrated into the image capture process.

CT screenshot

IS Apps successfully bid on the work to replace the existing PC and software, as an alternative to either purchasing a complete pre-assembled solution or outsourcing the work to a third party. Limited driver availability for the hardware meant Windows XP was the most recent OS suitable for the task, although fortunately this posed relatively few difficulties bar sourcing a new license!

A new C++ application was developed, using the Microsoft Foundation Class (MFC) library to provide the user interface. C++ was chosen because both the original image capture application and the majority of worked examples for the hardware were written in C++. The desire to improve the performance and memory usage of the image capture process also pointed towards C++, for the fine-grained control the language provides.

The interface was streamlined to remove unused options, the image-processing controls were expanded to clarify the available options, and a control was added to allow the camera hardware to be chosen. The image capture process itself was encapsulated in a new dialog window, displayed while the process is running, which contains the progress indicators and status display (start time, estimated end time, number of images captured, etc.).

CT screenshot 2

To enable development without requiring constant access to the hardware, support for “dummy” camera and table hardware was added, with the application emulating the relevant hardware internally. As the components operate independently of each other, the user interface, table and camera each run on their own thread within the application, ensuring that interaction with any of these three elements is processed as close to real time as possible. Communication between the elements is handled primarily via MFC message pumps from the hardware threads to the UI thread, with semaphores and events used to pass data down to the hardware threads where required (most hardware threads are given a fixed task to complete and are expected to return only when it is finished).

This revised coupling of the hardware control, combined with a reduction in how frequently the cameras need to be re-initialised during an image capture, reduced image capture overheads to a fraction of their previous times. Early estimates suggest around a tripling in performance, with the actual exposure time taken by the camera now dominating the total time.

An example of the resulting rendered “slices” of a section of meteorite, after reconstruction in Octopus, is shown below:


Obviously we’re very pleased with the project results, and hope this provides a clear illustration of the level of work IS Apps is capable of, as well as the cost benefits of in-sourcing of applications.