Geodata Server on R@CMon

The Australian Bureau of Statistics (ABS) provides public access to internet activity data as “data cubes” under the catalogue number “8153.0”. These statistics are derived from data provided by internet service providers (ISPs) and offer an estimate of the number of users (frequency) with access to a specific internet technology such as ADSL. While this survey is adequate for general observations, its granularity is too coarse to assess the impact of internet access on Australian society and economic growth. The Geodata Server project, led by Klaus Ackermann (Faculty of Business and Economics, Monash University), aims to provide significantly finer granularity on internet usage, in both the temporal and spatial dimensions, at the local-government level for Australia and for other cities worldwide.

IPv4 Heatmap and Project Background, Ackermann, Angus & Raschky: Economics of Technology, Wombat 2016

One of the main challenges in the project is the analysis of 1.5 trillion observations from the ABS data sets. The project requires high-performance and high-throughput computational resources to analyse this vast amount of data, as well as a substantial amount of storage space for reference and computed data. Another major challenge is how to architect the analysis pipeline to fully utilise the available resources. Over the last three years, several iterations of both the methodology and the infrastructure setup have been developed and tested to optimise the analysis pipeline. The R@CMon team engaged with Klaus to address the project's various computational, storage and analysis requirements. A dedicated NeCTAR project has been provisioned for Geodata Server, including computational resources on the Monash node of the NeCTAR Research Cloud, and storage was provisioned to the project via the VicNode allocation scheme.

Processing Workflow on R@CMon, Ackermann, Angus & Raschky: Economics of Technology, Wombat 2016

With the computational and storage resources in place, the project was able to progress with the development of the analysis pipeline based on various “big data” technologies. In coordination with the R@CMon team, several Hadoop distributions were evaluated, namely Cloudera, MapR and Hortonworks. The latter was chosen for its ease of installation and 100% open-source commitment. The resulting cluster consists of 32 cores with 8TB of Hadoop Distributed File System (HDFS) storage divided among 4 nodes; configurations of 16 cores across 2 nodes and 32 cores on a single node were also tested. The data has been distributed across 2TB volume drives, and the master node of the cluster has an extra-large volume attached to store the raw (reference) data. To optimise the performance of the distributed HDFS, all loaded data is stored in compressed Lempel–Ziv–Oberhumer (LZO) format, reducing the burden on the network, which is shared with other tenants on the NeCTAR Research Cloud.
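As a toy illustration of why compressed block storage helps, repetitive observation records of the kind the pipeline stores compress dramatically. The sketch below uses Python's standard-library gzip as a stand-in for LZO (whose Python bindings are third-party), and the record format is hypothetical:

```python
import gzip

# Hypothetical, repetitive log-style records, similar in spirit to IP-observation
# rows. Data like this compresses very well, which is why storing HDFS blocks in
# a compressed format (LZO in the project's case) cuts inter-node network transfer.
records = "\n".join(
    f"203.0.113.{i % 256},2014-01-01T00:00:00,ADSL" for i in range(10000)
)
raw = records.encode("utf-8")
compressed = gzip.compress(raw)  # stdlib gzip as a stand-in for LZO

ratio = len(raw) / len(compressed)
print(f"raw={len(raw)}B compressed={len(compressed)}B ratio={ratio:.1f}x")
```

LZO itself trades a lower compression ratio for much faster decompression than gzip, which is why it suits HDFS workloads where blocks are read repeatedly.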

Multi City Analysis, Ackermann, Angus & Raschky: Economics of Technology, Wombat 2016

Through R@CMon, the Geodata Server project was able to successfully handle and curate trillions of IP-activity observations and link them accurately to their geo-locations in single- and multi-city models. Analysis tools were laid down as part of a pipeline spanning high-performance computing (HPC) on Monash's supercomputers and Hadoop-style data parallelisation in the research cloud. Preliminary observations suggest strong spatial correlation and evidence of discontinuities in IP activity at political boundaries, pointing to cultural and/or institutional factors. Some of the models produced by this project are currently being curated in preparation for public release to the wider Australian research community, and the models are actively being improved with additional IP statistics from other cities around the world. As the data grows, the analysis pipeline and the computational and storage requirements are expected to scale as well. The R@CMon team will continue to support the Geodata Server project to reach its next milestones.
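The core of linking an IP observation to a geo-location is a range lookup. A minimal sketch of that idea follows, with a hypothetical two-row table; the project's real reference data and method are, of course, vastly larger and more sophisticated:

```python
import bisect
import ipaddress

# Hypothetical range-based IP geolocation: a table of (range_start, range_end,
# location) rows sorted by range start, queried by binary search on the integer
# form of the address. The ranges and locations below are illustrative only.
geo_table = [
    (int(ipaddress.ip_address("1.120.0.0")),
     int(ipaddress.ip_address("1.127.255.255")), "Melbourne"),
    (int(ipaddress.ip_address("101.160.0.0")),
     int(ipaddress.ip_address("101.191.255.255")), "Sydney"),
]
starts = [row[0] for row in geo_table]

def locate(ip):
    """Return the location whose range contains `ip`, or None."""
    n = int(ipaddress.ip_address(ip))
    i = bisect.bisect_right(starts, n) - 1
    if i >= 0 and geo_table[i][0] <= n <= geo_table[i][1]:
        return geo_table[i][2]
    return None

print(locate("1.120.5.9"))  # → Melbourne
```

Binary search keeps each lookup at O(log n) even for tables with millions of ranges, which matters when trillions of observations must be joined.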

Worm Strains Catalogue on R@CMon

Associate Professor Roger Pocock is the head of the Neuronal Development and Plasticity Laboratory at Monash University. Roger's lab investigates fundamental mechanisms of brain development using the organism Caenorhabditis elegans as a model system. Roger joined Monash University in 2014, bringing with him a comprehensive catalogue of worm strain data that had been carefully curated over years at his previous laboratory at the University of Copenhagen. The strains catalogue is held in a FileMaker database that the laboratory members regularly update and query for current and new strain entries.

C. elegans as a model system, Neuronal Development and Plasticity Laboratory

FileMaker (and its derivatives) is commercial software for creating custom applications for a variety of target platforms (e.g. web, iPad, Windows, Mac). The worm strains catalogue from Roger's lab was the first FileMaker-based database deployment on R@CMon. The R@CMon team installed and configured a fully licensed, up-to-date version of FileMaker Pro on the Monash node of the NeCTAR Research Cloud inside a dedicated tenancy (i.e. computational and storage resources) provisioned for the lab. The FileMaker software itself has been deployed on a Monash-licensed Windows Server instance, which receives the latest system and security updates from Microsoft.

Worm Strains Catalogue Entry

The FileMaker WebDirect feature has been enabled on the new server to allow easy access to the strains catalogue from standard web browsers over the internet, without any additional programming or software installation on the user's client machine. Proper HTTPS has been configured and enabled on the new WebDirect interface. Since then, and with the ongoing support of R@CMon, the catalogue has grown to include external collaborators' models derived from other strains.

Monash Connections Online

The Monash University Library’s Special Collections are a large compilation of various media, including rare books, music and multimedia, in various forms and languages such as Slavic, Asian, Yiddish and Jewish. These collections are considered among the most comprehensive in the whole of Australasia. Hosted on legacy infrastructure, the original collection's web presence had become a maintenance challenge for library administrators due to its ageing hardware and software stack. There was also a push in early 2016 to centralise the university's data centre infrastructure, where the legacy collections platform was hosted. This presented an opportunity for Monash University Library to migrate the collections onto one of the latest community-supported public collections publishing platforms. After evaluation, the open-source, freely available Omeka LAMP stack was chosen for the new platform.

Monash Collections Online Main Page

The R@CMon team engaged the various library stakeholders to spin up a test instance of Omeka on the Monash node of the NeCTAR Research Cloud. The team at the library tested the various hosting and publishing capabilities of Omeka, including the installation and integration of custom themes and plugins (e.g. multimedia playback plugins). After several consultations, demonstrations and rounds of rigorous testing between the R@CMon and library teams, the executive decision was made to adopt Omeka as the new publishing and showcasing platform for the library's special collections.

Monash Collections Online Tall Tales and True Exhibition

The R@CMon team deployed a highly available instance of Omeka on the NeCTAR Research Cloud, utilising the LAMP stack plus HAProxy. Through VicNode, a dedicated and accessible storage share has been provisioned for the collections, housing the various types of files and media for current and future public showcases and exhibitions. The newly minted Monash Collections Online platform was officially unveiled at the start of 2017, is now publicly available, and is regularly updated with new content by the library team. Since its release, the R@CMon team continues to support the new platform through standard and regular engagements with the Monash University Library.
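Conceptually, HAProxy in a setup like this spreads requests across duplicated web backends so that the service survives the loss of any single one. A minimal Python sketch of the round-robin balancing policy (hostnames are hypothetical, not the real servers):

```python
from itertools import cycle

# Illustrative sketch of round-robin load balancing, the kind of policy HAProxy
# applies in front of duplicated LAMP/Omeka backends. Hostnames are hypothetical.
backends = cycle(["omeka-web-1", "omeka-web-2"])

def next_backend():
    """Return the next backend in round-robin order."""
    return next(backends)

print([next_backend() for _ in range(4)])
# → ['omeka-web-1', 'omeka-web-2', 'omeka-web-1', 'omeka-web-2']
```

In production, HAProxy also health-checks each backend and removes failed ones from the rotation, which is what makes the deployment highly available rather than merely load-balanced.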

Monash Data Science on R@CMon

Back in 2015, the Faculty of Information Technology at Monash University started exploring the various data science platforms readily available on the web. Many of its researchers and lecturers had used interactive Python and R notebooks on their own desktops and laptops for small and medium-sized problems. These interactive notebooks provide ease of use, portability and collaboration tools. These features proved so useful that the faculty decided to adopt the notebooks for teaching and installed the software stack on the teaching-lab computers. Data science courses could then be run in these labs, with students running their analyses on notebook instances on each lab machine. For some time, this setup served the faculty's teaching requirements well, but as the number of students grew and more advanced and complex problems were tackled, it became apparent that a more scalable and highly available data science platform was needed.

Training Dataset Visualisation in JupyterHub

The R@CMon team started a journey with the faculty's staff to evaluate the readily available data science platforms. The team first deployed SageMathCloud (SMC) on the Monash node of the NeCTAR Research Cloud and assessed it for a couple of months. SageMath and its cloud version, SMC, are open-source platforms for mathematical and scientific analyses, providing an intuitive, interactive interface for running models and generating visualisations. The most attractive feature of SMC is that it was developed as a teaching platform from the outset, so various plugins for teacher-student interaction, such as notebook sharing and marking, were already available. Although SMC is open source, the R@CMon team encountered various setup and deployment issues. The team eventually deployed a basic setup of SMC with the key features, and consulted the developers and maintainers of SMC for support, but at that time they did not support private deployments. The next available data science platform was then assessed.

Samples Distribution Visualisation in JupyterHub

The team then moved on to evaluate IBM's Data Science Workbench (DSW) platform. DSW is not open source and cannot be deployed privately on the research cloud, but at that time it had the requisite analytic (e.g. Python, R) and collaboration features, and was used by the faculty to deliver a number of teaching courses. However, after several rounds of teaching, licensing issues left teachers and students unable to log in to DSW, and running notebooks crashed. These issues led the faculty to resume the search for another data science platform.

Features Correlation Visualisation in JupyterHub

JupyterHub is a multi-user system for serving interactive notebooks. It provides comprehensive documentation for various types of deployment and scaling options. Since its inception, JupyterHub has become mainstream in various teaching and research communities; for example, there were early adopters of JupyterHub for education at UC Berkeley, and JupyterHub has also been used to provide a publicly accessible, re-runnable model in Nature. These early adopters inspired the R@CMon team and faculty staff to replicate their success stories in the then-in-development online Graduate Diploma of Data Science.

R Classification Visualisation in JupyterHub

The R@CMon team deployed an instance of JupyterHub locally on the Monash node of the NeCTAR Research Cloud. The team then coordinated with the relevant lecturers on the configuration of the various Python and R libraries (e.g. numpy, scipy, ggplot2, matplotlib) to be used in the units. To support more dynamic user management, the R@CMon team integrated JupyterHub with the Monash User Directory service. This made adding and removing users easier, and users can log in with their own Monash credentials to access JupyterHub and run their analyses. To date, after roughly two years of usage, the R@CMon-hosted JupyterHub service has gone through several teaching periods and served hundreds of students. The R@CMon team is actively engaging with the faculty on future directions in delivering new content (e.g. PySpark) and preparing for the next, more exciting forms of interactive analysis (e.g. JupyterLab).
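Directory-backed authentication in JupyterHub is typically wired up in `jupyterhub_config.py`. The fragment below is a hedged sketch assuming the `ldapauthenticator` plugin; the server address and DN template are placeholders, not Monash's actual directory settings:

```python
# jupyterhub_config.py — illustrative sketch only.
# `c` is the configuration object JupyterHub supplies when loading this file.

# Authenticate against an LDAP directory (assumes the ldapauthenticator plugin).
c.JupyterHub.authenticator_class = "ldapauthenticator.LDAPAuthenticator"
c.LDAPAuthenticator.server_address = "ldap.example.monash.edu"  # placeholder
c.LDAPAuthenticator.bind_dn_template = [
    "uid={username},ou=users,dc=example,dc=edu",  # placeholder DN template
]

# Spawn one single-user notebook server per authenticated student.
c.JupyterHub.spawner_class = "jupyterhub.spawner.LocalProcessSpawner"
```

With a setup along these lines, adding or removing a student is a directory operation rather than a change on the hub itself, which is what makes the user management dynamic.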

More FLAIR to Fluid Mechanics via the Monash Research Cloud

Advanced research in engineering can often benefit from extra compute capacity. This is where a research-oriented computational cloud like R@CMon is very handy. We report on the use of cloud resources to augment the capacity available for running large-scale fluid mechanics studies.

FLAIR (Fluids Laboratory for Aeronautical and Industrial Research), from the Department of Mechanical and Aerospace Engineering, Faculty of Engineering, has been conducting experimental and computational fluid mechanics research for over twenty years, focusing on fundamental fluid flow problems that impact the automotive, aeronautical, industrial and biomedical fields.

A key research focus in recent years has been understanding the wake dynamics of particles near walls. Particle-particle and wall-particle interactions were investigated using an in-house spectral-element numerical solver. Understanding these interactions is key in many engineering industries. In biological engineering applications, blood cells/leukocytes are numerically modelled as canonical bluff bodies (i.e., cylinders and spheres). These simulations are not only useful for understanding biological cell transport but have wider applications in mineral processing, chemical engineering and even ball sports. Due to the computational and data-intensive nature of this research, getting access to sufficient computing resources has always been a challenge.

In particular, the project aims to understand the wake dynamics of multiple particles in scenarios such as rolling, collisions and vortex-induced vibrations, and the resultant mixing that occurs through these interactions. The group's two- and three-dimensional fluid flow solver also incorporates two-way body dynamics to model these effects. As the studies involve multiple parameters, such as Reynolds number, body rotation and the height of the body above the wall, the total parameter space is extensive, requiring significant computational resources. While the two-dimensional simulations are carried out on single processors, their three-dimensional counterparts require parallel processing, making NeCTAR nodes an ideal platform for these computations. Some of the visualisations from the group's three-dimensional simulations are shown in Figures 1 and 2 below.
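To see how such a parameter space multiplies, consider a hypothetical sweep over three of the study's dimensions (the values below are illustrative only, not the group's actual grid):

```python
from itertools import product

# Hypothetical sweep over Reynolds number, rotation rate and gap height.
# Each combination is an independent simulation, which is what makes the
# workload a natural fit for high-throughput cloud resources.
reynolds = [100, 200, 300, 400]
rotation = [-1.0, 0.0, 1.0]
height = [0.5, 1.0, 2.0]

jobs = [
    {"Re": re, "alpha": a, "h": h}
    for re, a, h in product(reynolds, rotation, height)
]
print(len(jobs))  # → 36 independent simulations
```

Even this toy grid yields dozens of runs; finer resolution in any one dimension multiplies the total, which is why sweeps of this kind consumed over a million CPU hours.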

Since 2008, the FLAIR team has been making good use of the Monash Campus Cluster (MCC), a high-performance/high-throughput heterogeneous system with over two thousand CPU cores. However, MCC is heavily utilised by researchers from across the university and FLAIR users often found themselves waiting long periods before they could run their fluid flow simulations. It became clear that FLAIR researchers needed additional computational resources.

R@CMon secured a 160-core allocation for the FLAIR team, a valuable increase in the resources available to the group. Thanks to both NeCTAR and MCC-R@CMon, over one million CPU hours distributed across 4,000 jobs have been provided for the project's CPU-intensive calculations.

This has resulted in a number of publications in the highest impact fluid mechanics journals, with several more in a pre-submission stage; for example:
  • Rao, A., Thompson, M.C., & Hourigan, K. (2016) “A universal three-dimensional instability of the wakes of two-dimensional bluff bodies.” Journal of Fluid Mechanics, 792, 50–66.
  • Rao, A., Radi, A., Leontini, J.S., Thompson, M.C., Sheridan, J., & Hourigan, K. (2015) “A review of rotating cylinder wake transitions.” Journal of Fluids and Structures, 53, 2–14.
  • Rao, A., Radi, A., Leontini, J.S., Thompson, M.C., Sheridan, J., & Hourigan, K. (2015) “The influence of a small upstream wire on transition in a rotating cylinder wake.” Journal of Fluid Mechanics, 769 (R2), 1–12 (published online).
  • Rao, A., Thompson, M.C., Leweke, T., & Hourigan, K. (2013) “The flow past a circular cylinder translating at different heights above a wall.” Journal of Fluids and Structures, 41, 9–21.
  • Rao, A., Passaggia, P.-Y., Bolnot, H., Thompson, M.C., Leweke, T., & Hourigan, K. (2012) “Transition to chaos in the wake of a rolling sphere.” Journal of Fluid Mechanics, 695, 135–148.

Figure 1. Visualisation from the group's three-dimensional simulations.
Figure 2. Visualisation from the group's three-dimensional simulations.

The Monash Country Lines Archive on R@CMon

The Monash Country Lines Archive (MCLA) is a collaborative project between the Monash Indigenous Centre (MIC), Faculty of Arts and the Faculty of Information Technology with a team of researchers, digital animators and students. The MCLA aims to support the indigenous Australian communities in the preservation of stories that combine their history, knowledge, poetry, songs, performance and language. MCLA began working with the Yanyuwa people of Borroloola, NT, creating a number of animations between 2007 and 2010. It was these animations that caught the attention of Dr Alan Finkel, the then Chancellor of Monash University. In 2011, the Alan and Elizabeth Finkel Foundation supported the project for a further five years.

Render from “Why We All Die” 2015 ©MCLA & Taungurung Dolodanin-dat Animation Group.

Since its foundation in 2011, the MCLA has produced nine short-form animated films, ranging from four to twenty-four minutes in length, working with the communities through every step of the animation process: script, storyboards, character and landscape concepts and construction, animation, rendering, sound and post-production. Initially, producing these animations was challenging due to their heavy computational requirements. The MCLA team didn't have access to the dedicated render-farm resources that would be normal for a commercial animation studio, so all rendering work was done on individual desktops and laptops. This limitation forced the MCLA team to forgo advanced rendering techniques in order to render a large number of scenes quickly while still maintaining a certain level of production quality.

Render from “Jibi the Giant Spirit Birds” 2013 ©MCLA & Nyamba Buru Yawuru.

In 2013, the MCLA team gained access to the NeCTAR Research Cloud, giving them a much-needed boost in rendering capacity. The R@CMon team assisted the MCLA in deploying and dynamically scaling a distributed rendering workflow in the research cloud. Modelling, animation and rendering software have been licensed and configured on this virtual render farm, which is set up so that the MCLA can easily access it remotely to submit jobs and inspect their renders. The MCLA then started applying advanced rendering techniques in their workflow, techniques that weren't possible on their previous setup. After several years of usage, the demand on the MCLA for more and more high-quality visualisations also grew. This required the render farm to scale more, much more, and it did.
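Distributed rendering of this kind typically comes down to partitioning a film's frame range into chunks that independent workers can render in parallel. A hypothetical sketch (chunk size and frame counts are illustrative, not MCLA's actual settings):

```python
# Hypothetical sketch of splitting an animation's frame range into chunks for
# distribution across render-farm workers.
def frame_chunks(first, last, chunk):
    """Yield inclusive (start, end) frame ranges of at most `chunk` frames."""
    for start in range(first, last + 1, chunk):
        yield (start, min(start + chunk - 1, last))

# e.g. a four-minute film at 25 fps is 6000 frames
chunks = list(frame_chunks(1, 6000, 500))
print(len(chunks), chunks[0], chunks[-1])  # → 12 (1, 500) (5501, 6000)
```

Because each chunk is independent, adding cloud instances to the farm scales throughput almost linearly, which is what let the farm grow with the MCLA's demand.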

Render frame from “Janyju the Red Lizard” 2014 ©MCLA & Nyamba Buru Yawuru.

Access to the research cloud-backed render farm removed a huge limitation for the MCLA, inspiring them to produce more animations for the indigenous Australian communities without compromising on quality. The R@CMon team will continue to support the MCLA going forward and will be there when the time comes that the farm needs more power. The MCLA is composed of Dr John Bradley, Dr Shannon Faulkhead, Brent D McKee, Dr Tom Chandler and Chandara Ung.


Monash Macromolecular Crystallisation Facility Upgrade on R@CMon

The Monash Macromolecular Crystallisation Facility (MMCF), a Monash Technology Research Platform, was established in 2009 and is operated by the Structural Biology Unit at Monash University. The MMCF provides access to a fully automated platform for the high-throughput crystallisation of biological macromolecules. Macromolecular crystallography reveals the 3D structures of biological macromolecules in unparalleled detail and provides the basis for the rational design of therapeutics. The MMCF is considered to be the largest macromolecular crystallisation facility in the world.

The Monash Macromolecular Crystallisation Facility

The MMCF partnered with Formulatrix, Monash eSolutions and R@CMon to upgrade the facility's IT infrastructure for the next generation of crystallisation technology. The R@CMon team provisioned a custom Microsoft Windows-based infrastructure on the Monash node of the NeCTAR Research Cloud to host the platform's new crystallisation and imaging system. An enterprise database has been configured and is maintained by Monash eSolutions to support the new system, and the facility's networking infrastructure has also been completely revamped by Monash eSolutions. The R@CMon team worked with the vendor, Formulatrix, to deploy the instrument's software stack, and dedicated research data storage has been provisioned for the facility's experiment and imaging data.

Protein Crystals Imaging in Action

The R@CMon team and Monash eSolutions are working together to support the facility going forward. The Monash Macromolecular Crystallisation Facility (MMCF) project story first appeared on Monash University's The Insider.