The Rise of Engineering-Driven Analytics

9:20–10:00 a.m.

Engineering data has become essential in business-critical systems and applications. Audio, image, real-time video, motion, machine performance metrics, and other sensor-generated data is combined with business, transactional, and other IT data to create opportunities for sophisticated analytics on more complex phenomena. The flexibility to run those analytics, either on massive data sets in IT or cloud infrastructures or as the data is acquired on smart sensors and embedded devices, enables organisations in many industries to develop intelligent products, devices, and services that expand the business impact of their data and analytics. This talk provides numerous examples of this in action and discusses new capabilities in MATLAB® and Simulink®. Learn how to design and develop these systems and be a leading force in this new analytics-driven age.

Michelle Hirsch, MathWorks
Stéphane Marouani, MathWorks

Introduction to Data Analytics with MATLAB

8:45–9:15 a.m.

Attend this session to learn how MATLAB® can take you beyond Microsoft® Excel®. Automate your analysis workflows with thousands of prebuilt mathematical and advanced analysis functions and versatile visualisation tools. Through product demonstrations, see how to:

  • Access data from files and Excel spreadsheets
  • Visualise data and customise figures
  • Perform statistical analysis, machine learning, optimisation, and predictive modelling
  • Generate reports and automate workflows
  • Share analysis tools as standalone applications, websites, or Excel add-ins

This session is intended for people who are new to MATLAB.

David Willingham, MathWorks

What’s New in MATLAB in R2015b and R2016a

10:00–10:30 a.m.

In this session, David covers the latest features and new toolboxes from the past year:

  • New execution engine that runs MATLAB® code faster
  • New Live Editor for developing live scripts, creating an interactive narrative, editing symbolic code, and visualising results
  • New App Designer for building MATLAB apps with an enhanced design environment and expanded component library
  • Deep learning with convolutional neural networks (CNNs) for image classification tasks
  • Classification Learner app that trains multiple models automatically

David Willingham, MathWorks

Assessing Victoria's Native Vegetation Clearing Requirements Using MATLAB

11:00–11:30 a.m.

In Victoria, a permit is required to remove, destroy, or lop native vegetation. These regulations are known as the native vegetation permitted clearing regulations. A risk-based approach is used to determine what the requirements for a clearing application are. The combination of a location risk and an extent risk determines the risk-based pathway for the clearing application. When the risk-based pathway is determined to be moderate or high, an assessment of the impact on Victorian rare and threatened species is required.

Determining the impact of a clearing on Victorian rare and threatened species is a non-trivial problem that must be solved for every clearing application submitted to the Department of Environment, Land, Water and Planning. The core calculation is determining the degree to which clearing a given polygon impacts an individual species model. If the impact is above a certain threshold, the specific offset requirements for that species must be determined. This process needs to be repeated for all species models (currently 1733). A reasonable turnaround time is required to ensure that all applications are processed in a timely manner.
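The core intersection test can be illustrated with a minimal MATLAB sketch (not the department's actual implementation); the polygon coordinates, threshold, and variable names below are illustrative assumptions.

    % Minimal sketch: fraction of a species habitat polygon removed by a proposed clearing.
    % Coordinates, threshold, and variable names are illustrative assumptions only.
    clearing = polyshape([0 0; 40 0; 40 30; 0 30]);       % proposed clearing polygon (m)
    habitat  = polyshape([20 10; 90 10; 90 80; 20 80]);   % one of the 1733 species model polygons

    overlap        = intersect(clearing, habitat);        % region of habitat affected by the clearing
    impactFraction = area(overlap) / area(habitat);       % proportion of the species model impacted

    threshold = 0.005;                                     % assumed trigger level for offset requirements
    if impactFraction > threshold
        fprintf('Offset requirements apply for this species (impact %.2f%%)\n', 100*impactFraction);
    end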

The department has developed a solution using MATLAB® to process these applications. This presentation discusses some of the complexities which have to be solved, including working with large data sets, polygon intersections, parallel processing, interfacing with external software (Microsoft® Word and Excel), MEX files, version control, and deploying to end users within and outside the department. Products used include MATLAB, Mapping Toolbox™, Parallel Computing Toolbox™, MATLAB Compiler™, and MATLAB Production Server™.

Gordon Forbes, Department of Environment, Land, Water and Planning

Brain Waves

11:00–11:30 a.m.

Unifying theories of natural phenomena, expressed in mathematical form, have had enormous success in science, essentially “solving” the problems of electromagnetism, gravity, thermodynamics, and particle physics. Could the processes that prescribe the dynamics of the brain’s physical states be governed by a closed set of equations that could be discovered and written down? What might the equations for the brain look like? How will they be obtained? Will they “solve” neuroscience?

QIMR Berghofer combines advanced analysis of brain imaging data with computational modelling to study the mathematical structure of connections in the brain and the complex neuronal dynamics they support. In this presentation, Michael shows how MATLAB® was used in this work.

Michael Breakspear, QIMR Berghofer Medical Research Institute

Measuring Fluorescence Signal Colocalisation and Quantification in Biological Systems Using MATLAB for Image Processing

11:00–11:30 a.m.

Children’s Medical Research Institute (CMRI) was Australia’s first dedicated paediatric medical research facility and has been helping to save the lives of children for over 57 years. At CMRI, fluorescence microscopy is routinely used to visualise protein, DNA, and cell structures labelled with different fluorophores. The degree of signal overlap between the different channels is analysed in the resultant images, and this serves as a measure for colocalisation of the biological entities labelled by the fluorophores. Researchers were performing this colocalisation work manually, which was time consuming, tedious, and prone to human error.

Using Image Processing Toolbox™, CMRI researchers developed a novel method to automatically identify regions of fluorescent signal on two channels, identify the colocated parts of these regions, and calculate the statistical significance of the colocalisation. Using GUIDE, they developed a user interface to visualise signal colocalisation and fine-tune user-defined parameters for the colocalisation analysis, including the application of median or Wiener filtering to improve the signal-to-noise ratio. Command-line execution allows batch processing of multiple images. Users can also calculate the statistical significance of the observed signal colocalisations compared to overlap by random chance using Student’s t-test. Validation revealed a highly significant correlation between manual and automatic identification of colocalisations, so the automatic method can replace manual colocalisation counting and has the potential to be applied to a wide range of biological areas.
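A minimal sketch of this style of two-channel analysis is given below; the file names, filter size, and thresholding choices are assumptions for illustration, not CMRI's actual tool.

    % Illustrative two-channel colocalisation (file names and parameters are assumptions).
    ch1 = im2double(imread('channel1.tif'));      % first fluorophore channel
    ch2 = im2double(imread('channel2.tif'));      % second fluorophore channel

    ch1 = wiener2(ch1, [5 5]);                    % optional Wiener filtering to improve SNR
    ch2 = wiener2(ch2, [5 5]);

    bw1 = im2bw(ch1, graythresh(ch1));            % segment signal regions on each channel
    bw2 = im2bw(ch2, graythresh(ch2));

    coloc         = bw1 & bw2;                    % colocated pixels
    fractionColoc = nnz(coloc) / nnz(bw1 | bw2);  % simple overlap measure
    colocRegions  = regionprops(coloc, 'Area', 'Centroid');  % per-region statistics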

Matloob Khushi, Children’s Medical Research Institute

Mineral Process Plant Schedule Optimisation

11:00–11:30 a.m.

Many processing plants contain multiple internal constraints. As ore feed grades and properties change over the life of mine, processing bottlenecks shift from one unit process to another and different operating strategies need to be adopted to maximise cash generation from the process.

While many mine scheduling tools exist, few process tools are used for schedule optimisation. The mining definition of ore is either on a per-mining-block basis, with value assigned to each block, or on a schedule basis, where ore tonnage and properties are scheduled for processing over time. A process scheduling tool needs to select from the available mining inventory and adjust process plant conditions to achieve an optimum outcome (either in the form of plant production or a cost objective). A commonly overlooked factor is the set of internal process plant constraints, which can only be determined during ore selection and need to be satisfied at the optimum condition.

MATLAB® was used to develop a function that describes a full process plant mass balance according to the major unit operations. This function returns a user-selected objective function and non-linear constraint conditions based on a given set of ore feeds, unit operation equipment availabilities, and plant configuration or operating conditions. A script was developed to read in mining data and process assumptions from a Microsoft® Excel® file and to manage the ore inventory of the schedule. Optimization Toolbox™ was used to determine an optimum condition for each interval in the given schedule that satisfied a set of linear and non-linear constraints.
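A hedged sketch of this optimisation step follows; plantMassBalance, plantConstraints, the spreadsheet name, and the bounds are hypothetical stand-ins rather than the actual Newcrest model.

    % Illustrative schedule-interval optimisation (functions and data are hypothetical).
    data = xlsread('mining_schedule.xlsx');               % ore tonnages, grades, and process assumptions

    x0  = [0.4; 0.3; 0.3; 1.0];                           % initial ore blend fractions and a plant setting
    lb  = zeros(size(x0));                                % lower bounds
    ub  = [1; 1; 1; 1.2];                                 % upper bounds
    Aeq = [1 1 1 0]; beq = 1;                             % blend fractions must sum to 1

    objective = @(x) -plantMassBalance(x, data);          % negate to maximise metal production
    nonlcon   = @(x) plantConstraints(x, data);           % returns [c, ceq] for internal plant limits

    opts = optimoptions('fmincon', 'Display', 'iter');
    xOpt = fmincon(objective, x0, [], [], Aeq, beq, lb, ub, nonlcon, opts);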

This optimisation process has been applied to different complex gold operations at Newcrest. The end result is an automated scheduler that selects the optimum blend of ores as well as plant operating conditions that satisfy all internal plant and shutdown constraints while maximising the objective of choice (metal production or cash flow). The models are also valuable for evaluating expansion or de-bottlenecking opportunities for individual processing units. These models have also been deployed to end users in a compiled version, with significant savings in time to run different scenarios compared with the Excel-based systems. Other benefits of switching to a MATLAB-based platform include more transparent auditing of the model assumptions, reliability of the output, quick turnaround of multiple scenarios, and ease of adding schedule constraints as conditions change over time.

David Seaman, Newcrest Mining Ltd.

What’s New in Simulink in R2015b and R2016a

11:30 a.m.–12:00 p.m.

This session covers recently added capabilities in the Simulink® product family for Model-Based Design. The added capabilities cover the cornerstones of Model-Based Design including plant modelling, control design, real-time testing, automatic code generation, and verification and validation activities.

Ruth-Anne Marchant, MathWorks

Analysis of Array Gain for a Circular HFDF Antenna Array

12:00–12:30 p.m.

Project Nullarbor provides a new High Frequency Direction Finding (HFDF) capability for the Defence High Frequency Communication System (DHFCS) for the Australian Defence Force. One of the requirements of the project is to provide an HFDF system that is externally noise-limited over the full frequency range of the system, in this case 2 MHz to 30 MHz. A system is externally noise-limited if the environmental noise is higher than the system noise floor. Some of the key factors that determine whether a system is externally noise-limited are antenna element gain, antenna element return loss, array gain, and receiver noise floor.

The Nullarbor Project is building two HFDF arrays at the four DHFCS receiver sites. These comprise a low-band array with 13 m elements covering the frequency range of 2 MHz to 10 MHz and a high-band array with 6.5 m elements covering the frequency range of 10 MHz to 30 MHz. Each of these elements is an offset centre-fed monopole, and their individual gains are limited by their return loss. Further, when the elements are used in an array, the array gain also needs to be determined from the phasing applied to the individual elements.

The individual antenna element gains and the return loss of the elements were determined using the Numerical Electromagnetics Code (NEC4) provided by Lawrence Livermore National Laboratory. MATLAB® was used to extract the relevant information from the NEC4 output files and to convert the raw data into gains at various take-off angles to allow determination of the antenna gain for the system.

Three techniques were used to determine the array gain over the operating frequency range of the arrays:

  • A first-principles model prepared in MATLAB
  • An array model developed based on the methods proposed by Constantine Balanis in Antenna Theory: Analysis and Design, 3rd Edition
  • An array model developed based on Phased Array System Toolbox™

A comparison of the modelling approaches showed gains peaking at approximately 9 dB for both the first-principles approach and the Balanis approach, but a higher gain peaking at 10 dB for the phased array approach. This higher gain indicated an error in the model, which needed to be investigated, as eight antenna elements should provide at best 9 dB of gain improvement, not 10 dB.

Following a review with MathWorks experts, the phased array antenna element was changed from a phased.ShortDipoleAntennaElement to a phased.IsotropicAntennaElement, giving an isotropic radiation pattern consistent with the two other methods used for the analysis; this approach provided almost identical results to the Balanis method. Taking this factor into account, we were able to confirm that, even with the reduced array gain at the lower end of the band, the sites are externally noise-limited.
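A minimal sketch of the corrected Phased Array System Toolbox model is shown below; the element count, array radius, and frequency values are illustrative assumptions, not the Nullarbor design parameters.

    % Illustrative array gain of a uniform circular array with isotropic elements.
    % Element count, radius, and frequency are assumptions, not the Nullarbor design.
    fc      = 10e6;                                            % analysis frequency (Hz)
    element = phased.IsotropicAntennaElement('FrequencyRange', [2e6 30e6]);
    array   = phased.UCA('NumElements', 8, 'Radius', 25, ...   % 8-element circular array, 25 m radius
                         'Element', element);

    gainCalc = phased.ArrayGain('SensorArray', array, ...
                                'PropagationSpeed', physconst('LightSpeed'));
    g = step(gainCalc, fc, [0; 10]);                           % array gain (dB) at 0 deg azimuth, 10 deg take-off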

Greg Mew, Boeing Defence Australia

Combining MATLAB with Amazon Web Services to Produce Highly Scaled Cloud-Based Applications

12:00–12:30 p.m.

This presentation describes a multitiered system for performing and documenting engineering calculations on the Amazon Web Service cloud platform using MATLAB® as the core computational engine. Utilising MATLAB Compiler™, MATLAB Compiler SDK™, and Database Toolbox™, it is shown that MATLAB can facilitate high-performance, highly scalable, and fault-tolerant cloud applications. The presentation discusses the novel deployment architecture and illustrates the implementation of this technology via examples involving the design of foundations for supporting offshore oil and gas infrastructure. The outcome is demonstrably better engineering design, achieved more quickly. In addition to addressing technical issues associated with accessing computer power, cloud-based approaches streamline workflow, enabling a mobile workforce and enhancing collaboration across large organisations. Cloud-based approaches also protect valuable intellectual property, which is critical in driving investment in innovative engineering solutions.
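As a hedged illustration of the deployment and data-access steps (not UWA's actual build), a MATLAB analysis function can be packaged as a shared library with MATLAB Compiler SDK and calculation records retrieved with Database Toolbox; the function, library, and connection names below are hypothetical.

    % Hypothetical packaging of an analysis function as a C++ shared library:
    mcc -W cpplib:libFoundationCalc -T link:lib foundationCapacity.m

    % Hypothetical retrieval of calculation records via Database Toolbox:
    conn    = database('calcstore', 'svc_user', 'password');
    results = fetch(conn, 'SELECT run_id, status FROM calculations');
    close(conn);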

James Doherty, The University of Western Australia

Hybrid Econometric Modelling to Evaluate Economic Impact

12:00–12:30 p.m.

In this presentation, QSPectral describes the use of machine learning to predict the effects of economic policy. The application of machine learning in econometrics is a relatively recent concept, mainly due to the requirement to identify causal conditions.

Machine learning, data mining, and predictive analytics all use data to predict some variable as a function of other variables, the key aspects being:

  • Insight, importance, and patterns
  • Inference—how a dependent variable changes as some independent variable changes

Traditional econometrics uses statistical methods for prediction, inference, and causal modelling of economic relationships. Inference is the ultimate goal and, in particular, causal inference is a goal for economic decision-making. Conventional statistical and econometric techniques such as regression often work well, but there are issues unique to large data sets that may require different tools.

QSPectral used MATLAB® and Statistics and Machine Learning Toolbox™ to develop a machine learning model based on ensembles of decision trees. Sanjeev presents the use of the TreeBagger algorithm in MATLAB to accomplish ensemble learning.
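A minimal sketch of this kind of bagged ensemble is shown below; the data file, predictor columns, and tree count are illustrative assumptions, not QSPectral's model.

    % Illustrative bagged regression-tree ensemble (data and column names are assumptions).
    data = readtable('economic_indicators.csv');
    X = data{:, {'interest_rate', 'unemployment', 'consumer_spend'}};   % hypothetical predictors
    y = data.gdp_growth;                                                % hypothetical response

    mdl = TreeBagger(200, X, y, 'Method', 'regression', 'OOBPrediction', 'on');

    oobErr = oobError(mdl);           % out-of-bag estimate of generalisation error
    yhat   = predict(mdl, X(1:5,:));  % predictions for new policy scenarios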

It is envisaged that, in the future, the economic data involved may require more powerful data manipulation tools. The adaptive capability of this model addresses these emerging requirements. Large data sets may allow for more flexible relationships, and machine learning techniques, such as decision trees, support vector machines, neural nets, deep learning, and so on, allow for more effective ways to model complex relationships.

Sanjeev also presents an interactive visualisation of the results where different what-if scenarios can be simulated to derive economic insights and drive investments and policy.

Sanjeev Naguleswaran, QSPectral Systems Pty. Ltd.

MATLAB and Simulink Code Generation for Industrial Inverters: Myths and Technique

12:00–12:30 p.m.

When developing a high-bandwidth distributed-control platform for high-power inverters, accurate simulation is paramount. When developing a modular, scalable inverter platform for use in a wide range of products, engineer productivity is also paramount. Accurate simulation requires modelling the relevant hardware components as well as maintaining a one-to-one model of the digital controllers and comms links, but what’s a suitable modelling approach that makes scaling up to multiple devices possible? Code generation tools promise increased developer productivity in several ways, but does trusting algorithm implementation to the machine generate less efficient code, or perhaps more efficient code? This presentation covers two areas: a concept we’ve evolved over the years for modelling continuous and discrete controllers, and insight into the code generation process in Simulink® to perhaps dispel some myths.

Robert Turner, ABB Limited

Rapid Algorithm Development for Planning and Control of an Actively Articulated Wheel-on-Leg Robot

12:00–12:30 p.m.

This presentation summarises the development of guidance, navigation, and control (GNC) software for an actively articulated wheel-on-leg rover. The Mars Analogue Multi-Mode Traverse Hybrid (MAMMOTH) quadruped is an 85 kg robot capable of changing its footprint, clambering over obstacles and reconfiguring its posture to meet sensing and traversability objectives. Due to the complexity of the GNC problem for this vehicle, a technique for efficient software development is required. To meet this requirement, Model-Based Design has been employed to rapidly develop and validate individual GNC algorithms within the executable prototype framework of the software.

The major software components discussed include the actuator and sensor interfaces, the kinematic controller used to independently control 11 degrees of freedom, the fusion of various localisation and mapping schemes, and the motion planner used to plan efficient paths through the rover’s complex configuration space.

An RGB-D Asus Xtion sensor used for simultaneous localisation and mapping (SLAM) is implemented using the Robot Operating System (ROS) and is interfaced to MATLAB with Robotics System Toolbox™. Results from various traverses in which the rover performs SLAM are discussed. Additionally, the fusion of inertial measurement unit data, wheel odometry, and laser range-finder data into the localisation scheme is summarised.
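A hedged sketch of this style of ROS interface is given below; the master URI and topic name are assumptions, not the MAMMOTH configuration.

    % Illustrative ROS interface via Robotics System Toolbox (URI and topic are assumptions).
    rosinit('http://mammoth-onboard:11311');             % connect to the rover's ROS master
    depthSub = rossubscriber('/camera/depth/points');    % RGB-D point cloud topic
    cloudMsg = receive(depthSub, 10);                     % wait up to 10 s for a message
    xyz = readXYZ(cloudMsg);                              % N-by-3 points for the SLAM/mapping pipeline
    rosshutdown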

A kinematic model of the MAMMOTH rover is formulated using recursive kinematic propagation. The model expresses the relationships between the independently and dependently driven points of actuation. Demonstrations of the rover driving its 11 degrees of freedom in both simulation and on a Mars analogue terrain are provided.

The final topic discussed is the motion planning scheme used. The Open Motion Planning Library (OMPL) is used with kinematics C++ code generated in MATLAB to produce feasible and efficient paths. Motion planning is demonstrated in a variety of simulated challenging planetary analogue environments.

MATLAB® and Simulink® have been used to facilitate the integration and validation of these individual components within software-in-the-loop, hardware-in-the-loop, and fully deployed development environments. An example workflow of the development of an actively articulated suspension technique to keep a constant body pose as the rover traverses rough terrain is summarised to highlight how each development environment is utilised.

The resulting software has enabled the demonstration of the full capabilities of a novel planetary rover exploration platform. Results from autonomous actively articulated suspension trials and autonomous digging missions are presented. The central contribution of this work is a demonstration of a rapid software development workflow for a complex robotic system.

William Reid, Australian Centre for Field Robotics

Terabyte-Scale Analysis of Neural Signals with MATLAB Distributed Computing Server

12:00–12:30 p.m.

With the increasing prevalence of wearable and implantable medical technology, large-scale digital data sets are becoming increasingly common in the health sciences. A recent first-in-human device trial of an implanted seizure prediction system has produced the longest continuous recording of human brain-wave data, totalling over 100,000 electrode-hours at 400 Hz. This data has the potential to provide unique insight into the mechanisms and statistics of epileptic events, as well as how the electrode-tissue interface changes over time.

Even basic data preprocessing creates a considerable computational burden when working with a data set of this scale. The calculation of more complex frequency-domain features is not computationally feasible without parallel computing. A 256-worker MATLAB Distributed Computing Server™ cluster and Parallel Computing Toolbox™ were used to process the data set. This feature extraction problem is well suited to parallel processing, as the outputs are a function only of the input data and not of previous outputs. As the number of workers is greater than the number of electrodes, the data was rearranged from dimension [channels x time (very long)] to [channels x job number x time (2 min. segment)]. A for loop was used to index the channel number, and a parfor loop was used to iterate over the time segments. This arrangement minimised communication overhead while utilising all available workers. A linear mixed-effects (LME) model in Statistics and Machine Learning Toolbox™ was used to determine a group-level linear trend.
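A minimal sketch of this loop arrangement follows; the channel and segment counts, loadSegment, and spectralFeatures are hypothetical stand-ins for the study's actual code.

    % Illustrative arrangement: outer for over channels, inner parfor over 2-minute segments.
    % loadSegment and spectralFeatures are hypothetical helper functions.
    nChannels = 16; nSegments = 5000; nFeatures = 8;
    fs = 400;                                              % sampling rate (Hz)

    features = zeros(nChannels, nSegments, nFeatures);
    for ch = 1:nChannels
        chanFeatures = zeros(nSegments, nFeatures);
        parfor seg = 1:nSegments
            x = loadSegment(ch, seg);                      % one 2-minute segment of channel ch
            chanFeatures(seg, :) = spectralFeatures(x, fs);
        end
        features(ch, :, :) = reshape(chanFeatures, 1, nSegments, nFeatures);
    end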

This presentation highlights the feasibility of using MATLAB® to analyse big neural data. Parallel Computing Toolbox makes the analysis computationally feasible, and Statistics and Machine Learning Toolbox streamlines the modelling process. This approach is relevant for estimating the long-term stability for a range of bionic devices.

Ewan Nurse, NeuroEngineering Laboratory, The University of Melbourne

Object Recognition and Computer Vision Using MATLAB and NVIDIA Deep Learning SDK

3:30–4:00 p.m.

Deep learning methods are providing algorithms to tackle some of the most complex problems in machine learning and artificial intelligence. Only recently has it become computationally feasible to apply these algorithms in areas such as computer vision and image recognition. Today’s most advanced graphics processors provide the processing power required to develop, train, and apply deep neural networks. In this presentation, Werner and Mike show how MATLAB® and NVIDIA’s Deep Learning SDK can be used for object recognition and computer vision using deep convolutional neural networks. They provide insight into the software framework, as well as the hardware architectures and platforms for research, big data, and embedded solutions. The presentation concludes with an outlook on the next generation of GPUs.

Werner Scholz, XENON Systems
Mike Wang, NVIDIA

Deep Learning and Data Analytics with MATLAB

1:30–2:15 p.m.

Object recognition enables innovative systems such as self-driving cars, image-based retrieval, and autonomous robotics. The machine learning and deep learning models these systems rely on can be difficult to train, evaluate, and compare. This session explores how MATLAB® addresses the most common challenges encountered while developing object recognition systems. This talk covers new capabilities for deep learning, machine learning, and computer vision.

David Willingham, MathWorks

Predictive Maintenance with MATLAB

2:15–3:00 p.m.

Companies that make industrial equipment are storing large amounts of machine data, with the notion that they will be able to extract value from it in the future. However, using this data to build accurate and robust models for prediction requires a rare combination of equipment, expertise, and statistical know-how.

In this session, David uses machine learning techniques in MATLAB® to estimate the remaining useful life of equipment. Using data from a real-world example, the session explores how MATLAB is used to build prognostic algorithms and take them into production, enabling companies to improve the reliability of their equipment and build new predictive maintenance services.

David Willingham, MathWorks

Model-Based Design: Design with Simulation in Simulink

1:30–2:15 p.m.

Join this session to discover how you can use Model-Based Design with MATLAB® and Simulink® to build a multidomain system model which can be used for early verification and system-level optimisation.

Through product demonstrations, see a high-level overview of the major capabilities and how you can use Simulink to design, simulate, and understand the dynamic behaviour of multidomain systems.

Ruth-Anne Marchant, MathWorks

Model-Based Design: Generating Embedded Code for Prototyping or Production

2:15–3:00 p.m.

Embedded code generation is fundamentally changing the way engineers work. Instead of writing thousands of lines of code by hand, engineers automatically generate their production code to increase productivity, improve quality, and foster innovation.

Ruth-Anne demonstrates how to speed up development time by automatically generating code that can be compiled and executed on target hardware.

Ruth-Anne Marchant, MathWorks