MyEd Updates 2014/5

One of the major projects over summer 2014 has been the update to MyEd. Behind the scenes we’ve moved from uPortal 3 to uPortal 4, although for most users the clearest change is the excellent work done by Learning, Teaching & Web on the new theme. The migration itself has taken months of effort, with many portlets (applications running within MyEd) essentially needing to be rewritten from scratch for the new version. The configurations of the two systems are not directly compatible, and tools had to be developed to update and maintain configurations for over 100 channels (the small sections seen by the user, such as “Learn” or “Staff Details”) across three different environments (development, test and live), testing each of these changes both in isolation and integrated into the complete system.

Many of these channels also depend on other applications (such as accommodation, event booking, Learn, etc.) which in some cases needed to be modified and those modifications tested. Extensive load testing was performed to ensure the systems would handle the very high load anticipated for any major university service at the start of term. Hopefully this helps to give an idea of the scale of the project.

So what next for MyEd? Mobile support was disabled in the current deployment, but a project is currently underway to add mobile device support for a number of core parts of MyEd. I’m sure many will be pleased to know this is expected to include both campus maps and timetabling, with email, calendar, Learn and a number of other tools available at launch. Naturally both iPhone and Android platforms will be supported, with full details to follow.

X-ray Tomography Control

The School of Geosciences has an X-ray computed tomography (CT) scanner installed within one of its basement laboratories, which researchers within the school have assembled from individual components. The scanner consists of a number of independent elements, including an X-ray tube and corresponding camera, an air table for holding and rotating samples, and PCs for controlling the X-ray tube and handling image capture from the camera. Custom software to control the image capture process was written by a PhD student, and this software was driven by Testpoint (which also controls the table on which the sample sits).


The original implementation was based around a Windows 98 installation, and while this solution was rock-solid, the time taken for the image capture process was significantly longer than ideal, at around 6 seconds per image (with a 0.5–1.5 second exposure time). Additionally, with the existing high-resolution camera displaying worsening artefacts from X-ray exposure, a more modern camera was purchased and needed to be integrated into the image capture process.

CT Screenshot

IS Apps successfully bid on the work to replace the existing PC and software, as an alternative to either purchasing a complete pre-assembled solution or outsourcing the work to a third party. Limited driver availability for the hardware meant Windows XP was the most recent OS suitable for the task, although fortunately this posed relatively few difficulties bar sourcing a new licence!

A new C++ application was developed, using the Microsoft Foundation Class (MFC) library to provide the user interface. C++ was chosen as both the original image capture application and the majority of worked examples for the hardware were written in C++. The desire to improve the performance and memory usage of the image capture process also pointed towards C++, for the fine-grained control the language provides.

The interface was streamlined to eliminate unused options, the image processing controls were expanded and clarified, and a control was added to allow the camera hardware to be chosen. The image capture process itself was encapsulated within a new dialog window which is displayed while the process is running, and contains the progress indicators and status display (start time, estimated end time, number of images captured, etc.).

CT Screenshot 2

To enable development without requiring constant access to the hardware, support for “dummy” camera and table hardware was added, with the application emulating the relevant hardware internally. As the components operate independently of each other, the user interface, table and camera each run on their own thread within the application, ensuring that interaction with any of these three elements is processed as close to real time as possible. Communication between the elements is handled primarily via MFC message pumps from the hardware threads to the UI thread, with semaphores and events used to pass data down to the hardware threads where required (most hardware threads are given a fixed task to complete, and are expected to return only once it has finished).

This revised coupling of the hardware control, combined with a reduction in how frequently the cameras need to be re-initialised during an image capture, reduced image capture overheads to a fraction of their previous times. Early estimates suggest around a tripling in performance, with the actual exposure time taken by the camera now dominating the time per image.

An example of the resulting rendered “slices” of a section of meteorite, after reconstruction in Octopus, is shown below:


Obviously we’re very pleased with the project results, and hope this provides a clear illustration of the level of work IS Apps is capable of, as well as the cost benefits of in-sourcing applications.